
Add mow compact key #16

Open · wants to merge 4 commits into master
Conversation

@mymeiyi (Owner) commented on Feb 18, 2025

No description provided.


sh-checker report

For full details, see the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.
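
ShellCheck offers no auto-fix for SC1127; the correction (assuming the line was meant as a comment) is simply to use `#`, the only comment syntax sh/bash recognize:

```shell
#!/usr/bin/env bash
output_file=$(mktemp)

# clear output file    (was: // clear output file)
echo "" > "${output_file}"
```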


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.
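
SC2154 flags that `ES_7_HOST` is never assigned in the script itself (it presumably arrives from the docker-compose environment). A defensive pattern, sketched here with a hypothetical `127.0.0.1` default, avoids silently producing `http://:9200` when the variable is missing; `${ES_7_HOST:?message}` is the abort-instead variant.

```shell
#!/usr/bin/env bash
unset ES_7_HOST                      # simulate the unassigned case
ES_7_HOST="${ES_7_HOST:-127.0.0.1}"  # fall back instead of expanding to empty
url="http://${ES_7_HOST}:9200/test1"
echo "${url}"
```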


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
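
SC2091 means the `$(...)` executes `nc`'s *output* rather than testing `nc` itself; the fix is to drop the command substitution and test the command directly. A self-contained sketch, with a stub `probe` standing in for `nc -z localhost "${HMS_PORT:-9083}"` so it terminates:

```shell
#!/usr/bin/env bash
attempts=0
probe() {                       # stands in for: nc -z localhost "${HMS_PORT:-9083}"
    attempts=$((attempts + 1))
    [[ ${attempts} -ge 3 ]]    # "port opens" on the third try
}

while ! probe; do               # correct: test the command, not its output
    sleep 0.1
done
echo "up after ${attempts} attempts"
```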


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.
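
For SC2011, the robust replacement enumerates the `.hql` files with `find ... -print0`, so names containing spaces or newlines survive the pipeline (`xargs -0` consumes the same NUL-delimited stream). A minimal runnable sketch over a temporary directory:

```shell
#!/usr/bin/env bash
dir=$(mktemp -d)
touch "${dir}/one.hql" "${dir}/two words.hql"

count=0
while IFS= read -r -d '' hql; do        # NUL-delimited, space-safe
    printf 'processing %s\n' "${hql}"
    count=$((count + 1))
done < <(find "${dir}" -maxdepth 1 -name '*.hql' -print0)

rm -rf "${dir}"
```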


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.
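
SC2125 is worth reading carefully here: the `*` entries are classpath wildcards meant for the JVM, not the shell, so the right fix is to quote the assignment and keep the globs literal rather than expand them. Hypothetical paths for illustration:

```shell
#!/usr/bin/env bash
HIVE_JARS="a.jar:b.jar"        # placeholder values
HADOOP_HOME="/opt/hadoop"

# Quoted: the '*' stays literal, which is exactly what `java -cp` expects.
HADOOP_HIVE_JARS="${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*"
echo "${HADOOP_HIVE_JARS}"
```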


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"
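
Note that the suggestion above clears SC2250 but not SC2145: inside a double-quoted string, `$@` mixes string and array. Joining the arguments with `$*` is the intended fix (sample arguments below are placeholders):

```shell
#!/usr/bin/env bash
set -- --database demo --table t1      # sample arguments

# "$*" joins all positional parameters into one word inside the string.
msg="Running Command : java -cp app.jar org.apache.hudi.hive.HiveSyncTool $*"
echo "${msg}"
```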


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).
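
`tr -d "addr:"` deletes every occurrence of the characters `a`, `d`, `r`, and `:` anywhere in the stream, not the word `addr:`; it only appears to work because IPv4 addresses happen to contain none of those characters. A substitution that strips just the literal prefix:

```shell
#!/usr/bin/env bash
raw="addr:192.168.1.10"                 # sample ifconfig token

ip=$(printf '%s\n' "${raw}" | sed 's/^addr://')
echo "${ip}"
```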


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
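
The suggestion above only patches the quoting mechanically; building the command in a variable first is easier to read and to quote correctly. A sketch with placeholder values:

```shell
#!/usr/bin/env bash
container_id="abc123"
ip_host="10.0.0.5"
topic="events"

cmd="/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server ${ip_host}:19193 --topic ${topic}"
line="docker exec ${container_id} bash -c '${cmd}'"
echo "${line}"
```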


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.
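
SC2242: exit statuses are limited to 0-255, so `exit -1` is invalid; `exit 1` is the conventional generic failure. A quick demonstration of the status actually observed by the caller:

```shell
#!/usr/bin/env bash
fail() { echo "unable to get IP address" >&2; exit 1; }   # was: exit -1

status=$( ( fail ) 2>/dev/null; echo $? )
echo "exit status: ${status}"
```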


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file
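The SC2024 fix pattern, demonstrated against a throwaway temp file so it runs without root (the /etc/hosts form is shown only as a comment):

```shell
# SC2024: in `sudo echo text >> file`, the redirect is performed by the
# *unprivileged* shell, so it fails on root-owned files. Let tee do the write:
#   echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" | sudo tee -a /etc/hosts >/dev/null
# Same pattern, without sudo, against a temp file:
tmp=$(mktemp)
echo "127.0.0.1 namenode" | tee -a "${tmp}" >/dev/null
cat "${tmp}"   # → 127.0.0.1 namenode
rm -f "${tmp}"
```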


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
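A sketch of the SC2155 point: combining `export` (or `local`) with a command-substitution assignment hides the command's exit status, since the statement returns the status of `export` itself:

```shell
# SC2155: `export VAR=$(cmd)` always returns 0, so a failing `cmd` goes
# unnoticed under `set -e`. Assign first, then export.
set -e
TPCH_DATA=$(realpath .)   # a bare assignment propagates realpath's status
export TPCH_DATA
echo "TPCH_DATA=${TPCH_DATA}"
```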


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
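Option 2 looks like this in practice (variable name taken from the SC2034 finding above; the directive applies to the next line only):

```shell
#!/usr/bin/env bash
# Suppress a single finding on the following line; NNNN is the SC code.
# shellcheck disable=SC2034
backup_dir=/home/work/pipline/backup_center   # SC2034 (unused) is now ignored
echo "done"   # → done
```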



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename


sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
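As a hedged illustration of option 2 above, a `# shellcheck disable=NNNN` comment placed directly above the flagged line silences only that finding while the rest of the file stays checked. The function below is a hypothetical sketch (the names `describe_status`, `compose`, and `status` mirror the findings quoted above but are not from any real script in this PR):

```shell
#!/usr/bin/env bash
# Sketch of a targeted shellcheck suppression, assuming SC2250
# (brace style) is the finding we deliberately want to ignore here.
describe_status() {
    local compose="$1" status="$2"
    # shellcheck disable=SC2250  # keep unbraced variables on the next line
    if [ "$status" -ne 0 ] && [ "$compose" != "db2" ]; then
        echo "docker ${compose} started failed with status ${status}"
    else
        echo "ok"
    fi
}
```

Option 3 (adding `-e SC2250` to `SHELLCHECK_OPTS` in the workflow file) disables the check repository-wide, so the inline comment is usually the safer default.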



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename



sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 
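
The `$(...)` form SC2006 suggests is not only stylistic; unlike backticks it nests without escaping. A small sketch (the hostname value is made up):

```shell
#!/usr/bin/env bash
# SC2006 demo: $(...) is equivalent to backticks but nests cleanly
# and quotes predictably.
inner_host="node-1"                          # illustrative value
fqdn=$(echo "host: $(echo "$inner_host")")   # inner $() needs no escaping
echo "$fqdn"
```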


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"
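
The SC2086 findings throughout this report all stem from the same mechanism: an unquoted expansion is word-split and glob-expanded before the command sees it. A minimal sketch (`count_args` is an illustrative helper, not from the script):

```shell
#!/usr/bin/env bash
# SC2086 demo: an unquoted variable is split on whitespace into
# several arguments; quoting passes it through as one.
count_args() { echo "$#"; }
path="dir with spaces"
count_args $path     # splits into 3 arguments
count_args "$path"   # stays 1 argument
```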


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.
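
SC1127 is an error, not a style nit: `//` is not a comment leader in shell, so that line would be executed as a command named `//`. The `#` form the checker suggests, sketched with a throwaway file:

```shell
#!/usr/bin/env bash
# SC1127 demo: shell comments start with '#'; '//' would be run as a
# command. (The temp file stands in for the script's output_file.)
# clear output file
output_file=$(mktemp)
echo "" > "${output_file}"
echo "cleared"
```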


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit
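
The SC2164 pattern flagged here (and in the sibling `run.sh` scripts below) matters because an unguarded `cd` that fails leaves the rest of the script running in the wrong directory. A minimal sketch (`enter` is an illustrative helper):

```shell
#!/usr/bin/env bash
# SC2164 demo: guard every cd so a bad path stops execution at the
# failure point instead of continuing in the old directory.
enter() {
    cd "$1" 2>/dev/null || return 1
    pwd
}
workdir=$(mktemp -d)
enter "$workdir"
enter /no/such/dir || echo "refused to continue"
```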


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
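
SC2091 here is more than style: `while ! $(nc -z ...)` runs the probe and then tries to execute its *output* as a second command. Dropping the `$()` tests the exit status directly. A sketch, with `probe` standing in for the report's `nc -z localhost 9083`:

```shell
#!/usr/bin/env bash
# SC2091 demo: test the command's exit status directly; never wrap the
# loop condition in $().
probe() { true; }                  # stand-in for 'nc -z host port'
while ! probe; do sleep 1; done    # exits immediately: probe succeeds
echo "service is up"
```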


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.
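
The `find ... -print0 | xargs -0` form SC2011 recommends survives filenames that `ls | xargs` would split. A sketch with throwaway `.hql` files created just for the demo:

```shell
#!/usr/bin/env bash
# SC2011 demo: NUL-delimited find|xargs handles any filename,
# including ones containing spaces.
scripts=$(mktemp -d)
touch "$scripts/plain.hql" "$scripts/with space.hql"
count=$(find "$scripts" -name '*.hql' -print0 | xargs -0 -n1 printf '%s\n' | wc -l | tr -d ' ')
echo "$count files found"
```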


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.
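
SC2125 flags a real trap: globs stay literal in plain assignments, so that classpath would contain `*` characters verbatim. Expanding through an array and joining with `:` avoids it. A sketch (the `.jar` names are made up, not the real Hadoop layout):

```shell
#!/usr/bin/env bash
# SC2125 demo: a glob in a plain assignment is literal; expand it via
# an array, then join the elements with ':' for a classpath string.
libdir=$(mktemp -d)
touch "$libdir/a.jar" "$libdir/b.jar"
literal="$libdir/*.jar"          # no expansion: keeps the '*'
jars=("$libdir"/*.jar)           # expands to the two files
classpath=$(IFS=:; printf '%s' "${jars[*]}")   # join in a subshell, IFS untouched outside
echo "$classpath"
```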


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"
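
SC2145 fires because `"$@"` expands to separate words even inside a larger string; for a log line like the one above, `"$*"` joins the arguments into one word. A sketch (`log_args` and `demo-tool` are illustrative):

```shell
#!/usr/bin/env bash
# SC2145 demo: use "$*" to flatten the argument list into a single
# string for logging; keep "$@" for passing arguments through.
log_args() {
    echo "Running Command : demo-tool $*"
}
log_args --db default --table t1
```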


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).
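
SC2020 is worth heeding here: `tr -d "addr:"` deletes every `a`, `d`, `r`, and `:` character, not the word `addr:`, which would mangle any value containing those characters. Prefix removal keeps the value intact. A sketch using an IPv6-style value, where the difference is visible:

```shell
#!/usr/bin/env bash
# SC2020 demo: tr -d deletes a *set of characters*; use parameter
# expansion to strip a literal prefix.
v6="addr:fe80::1"
echo "${v6#addr:}"               # strips only the leading "addr:"
echo "$v6" | tr -d "addr:"       # also eats the address's own colons
```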


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
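
SC2027's point is that the inner quotes around `${container_id}` close and reopen the string, unquoting the variable; one outer pair of double quotes already covers it. A sketch (the container id is a made-up value):

```shell
#!/usr/bin/env bash
# SC2027 demo: keep the whole message inside one pair of double
# quotes; variables expand there without extra quoting.
container_id="3f9c"
msg="docker exec ${container_id} bash -c 'echo hello'"
echo "$msg"
```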


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.
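
SC2242 is an error because valid exit statuses are 0–255: POSIX `sh` rejects `exit -1` outright, and bash silently wraps it around to 255, so the intent is clearer as an explicit `exit 1` (or `exit 255`). The wrap-around can be observed directly:

```shell
#!/usr/bin/env bash
# SC2242 demo: bash masks 'exit -1' to 255; write the status you mean.
if bash -c 'exit -1'; then status=0; else status=$?; fi
echo "exit -1 became: $status"
```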


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
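The first two options can be sketched as follows. This is an illustrative example only, assuming hypothetical variable names (`CONTAINER_UID`, `words`) rather than the exact lines flagged above; `NNNN` in the directive is whichever shellcheck code you want to silence.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Option 1: fix the finding itself, e.g. SC2248/SC2250 -- brace and
# quote the expansion instead of leaving it bare.
CONTAINER_UID="doris--"
project="${CONTAINER_UID}hive2"
echo "compose project: ${project}"

# Option 2: keep the code as written and suppress one specific check.
# The directive applies only to the line immediately below it.
words="a b c"
# shellcheck disable=SC2086
printf '%s\n' $words    # word splitting is intentional here
```

Option 3 suppresses a check for the whole job instead of a single line, by adding `-e NNNN` to the `SHELLCHECK_OPTS` value in the workflow's `.yml` file.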



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename


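The same few shellcheck codes account for most of the findings above. As a minimal sketch (the variable names and the `a/b/c` sample value are illustrative, not taken from the scripts), the recurring fixes look like this:

```shell
#!/usr/bin/env bash
set -euo pipefail

# SC2006: use $(...) instead of legacy backticks.
host_fqdn=$(hostname -f)

# SC2155: declare first, assign second, so the substituted
# command's exit status is not masked by the assignment.
escaped_entry=""
escaped_entry=$(printf '%s' "a/b/c" | sed 's/\//\\\//g')

# SC2086 / SC2250: double-quote expansions and wrap them in braces.
echo "defaultFS=hdfs://${host_fqdn}:8020"
echo "escaped=${escaped_entry}"   # prints: escaped=a\/b\/c
```

Applying these patterns (plus `shfmt -w` for the indentation diffs) should clear the bulk of the report.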
sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
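As a concrete illustration of options 1 and 2 above, here is a minimal sketch (file paths and variable values are hypothetical, not taken from the scripts in this PR) showing an in-place SC2006/SC2086 fix next to a targeted disable comment:

```shell
#!/usr/bin/env bash
# Option 1: correct the finding in place.
# Before (SC2006 legacy backticks, SC2086 unquoted expansion):
#   HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
# After: $(...) command substitution with a quoted variable reference.
HIVE_HOME="${HIVE_HOME:-/tmp/hive-demo}"
mkdir -p "${HIVE_HOME}/lib"
touch "${HIVE_HOME}/lib/hive-jdbc-3.1.3.jar"
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')
echo "HIVE_JDBC=${HIVE_JDBC}"

# Option 2: suppress one finding on one line where the flagged form is
# intentional, instead of fixing it.
# shellcheck disable=SC2006
LEGACY=`echo kept-on-purpose`
echo "LEGACY=${LEGACY}"
```

For option 3, the equivalent (assuming the sh-checker action reads a `SHELLCHECK_OPTS` environment variable, as its README describes) would be something like `SHELLCHECK_OPTS: -e SC2006 -e SC2086` in the workflow's `env:` block, which silences those codes repository-wide rather than per line.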



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename


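The recurring shellcheck classes in the report above (SC2006 legacy backticks, SC2086/SC2250 unquoted or unbraced expansions, SC2155 masked return values) all reduce to a few mechanical rewrites. A minimal sketch of the corrected patterns, using a hypothetical `core-site.xml` under `/tmp` rather than the real container paths:

```shell
#!/usr/bin/env bash
# Illustrative only: same patterns as hive-configure.sh, hypothetical paths.

# SC2006: use $(...) command substitution instead of legacy backticks
host=$(hostname 2>/dev/null || echo localhost)

# SC2086 / SC2250: brace and double-quote every expansion,
# so paths with spaces survive word splitting
path="/tmp/demo dir/core-site.xml"
mkdir -p "$(dirname "${path}")"
printf '<configuration>\n</configuration>\n' >"${path}"

# SC2155: assign on its own line so the command's exit status
# is not masked by the (local) declaration
entry="<property><name>fs.defaultFS</name><value>hdfs://${host}:8020</value></property>"
escapedEntry=$(printf '%s' "${entry}" | sed 's/\//\\\//g')

# insert the escaped property before the closing </configuration> tag
sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"
grep -q 'fs.defaultFS' "${path}" && echo "property added"
```

Running `shfmt -w` afterwards normalizes only layout (indentation, redirection spacing, `$(...)` style); the quoting and declaration fixes still have to be made by hand.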
sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit
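The `cd -` findings above all share one failure mode: if the `cd` fails, every later command runs in whatever directory the script happened to be in. A minimal standalone sketch of the guarded pattern (the directory name is hypothetical, not from these scripts):

```shell
# Why SC2164 matters: an unchecked cd that fails leaves later commands
# running in the wrong directory. Guard every cd so failure stops the
# unit of work instead of silently continuing.
enter_and_clean() {
    cd "$1" || return 1    # SC2164-safe: stop here if cd fails
    rm -f ./*.tar.gz       # never reached when the cd above failed
}

if ! enter_and_clean "/nonexistent-dir-for-demo" 2>/dev/null; then
    echo "cd failed, nothing removed"
fi
```

Inside a function `|| return 1` is the right guard; at script top level, `|| exit 1` as ShellCheck suggests.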


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
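For context on this SC2091 finding: `$(cmd)` substitutes `cmd`'s *output* back into the command line, so `while ! $(nc -z ...)` tries to execute whatever `nc` printed. To branch on exit status, call the command directly. A self-contained sketch with a stub standing in for `nc -z localhost 9083`:

```shell
# SC2091: to test a command's exit status, run the command itself in
# the condition; do not wrap it in $(...).
port_open() { false; }    # stub for: nc -z localhost "${HMS_PORT:-9083}"

tries=0
while ! port_open && [ "${tries}" -lt 3 ]; do
    tries=$((tries + 1))  # in the real script: sleep, then retry
done
echo "gave up after ${tries} tries"   # -> gave up after 3 tries
```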


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
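The backtick findings in `run_sync_tool.sh` (SC2006/SC2010/SC2012) all resolve the same way: `$(...)` nests cleanly and quotes predictably, and `find` handles odd filenames where `ls | grep` does not. A standalone sketch of the modern form, using a hypothetical temp directory rather than the Hudi jar path:

```shell
# SC2006/SC2012: prefer $(...) over backticks, and find over ls, when
# capturing a file path. Demo files are hypothetical stand-ins for the
# hoodie-hive-sync-bundle jar lookup flagged above.
DIR="/tmp/sc2006-demo"
mkdir -p "${DIR}"
touch "${DIR}/a.jar" "${DIR}/b.jar"

# Old style (flagged):  JAR=`ls -c $DIR/*.jar | head -1`
JAR=$(find "${DIR}" -name '*.jar' | sort | head -n 1)
echo "${JAR}"   # -> /tmp/sc2006-demo/a.jar
```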


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"
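The SC2145 error above is worth a standalone illustration: inside a double-quoted string, `$@` splices the arguments into the surrounding text as separate words, while `"$*"` joins them into one string, which is what a log line like this `echo` wants. A minimal sketch (the function name is hypothetical):

```shell
# SC2145: "$*" joins all arguments into one string for display;
# "$@" preserves them as separate words for passing along.
show() {
    echo "args were: $*"    # one clean string, right for logging
    printf '%s\n' "$@"      # each argument on its own line
}

show --db default --table "my table"
```

So the flagged `echo "... HiveSyncTool $@"` wants `$*`, while the actual `java` invocation on the next line correctly keeps `"$@"`.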


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"
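Most findings in this report are SC2086-family quoting issues, so a single standalone demonstration of the underlying hazard may help when reviewing them (variable contents here are illustrative):

```shell
# SC2086: an unquoted expansion undergoes word splitting and glob
# expansion; a quoted one is passed as a single argument.
COMPONENTS="hive kafka es"

count_args() { echo "$#"; }

count_args ${COMPONENTS}      # -> 3  (split on whitespace)
count_args "${COMPONENTS}"    # -> 1  (one intact argument)
```

For `echo ${COMPONENTS}` the split is usually harmless, but the same habit applied to paths with spaces or `*` is not, hence the blanket rule.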


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.
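On this SC2242 error: exit statuses are a single byte, 0-255; `exit -1` is a syntax error in POSIX sh and, in shells that accept it, wraps to 255 anyway. A small sketch of the conventional replacement (function name hypothetical):

```shell
# SC2242: use an explicit small status code instead of exit -1.
fail_empty_host() {
    echo "IP_HOST is empty" >&2
    return 1    # in a script body this would be: exit 1
}

status=0
fail_empty_host 2>/dev/null || status=$?
echo "status: ${status}"   # -> status: 1
```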


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file
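On this SC2024 warning: in `sudo echo ... >>/etc/hosts`, the redirect is performed by the invoking shell, not by `sudo`, so it fails on root-owned files even though the command line looks elevated. Routing the write through `tee -a` puts the file open under the elevated process. A sketch using a temp file so it runs without root:

```shell
# SC2024: redirects are done by the current shell, not by sudo.
# Pipe into tee -a, which performs the append itself. Demo writes to
# a temp file; the real call would be `... | sudo tee -a /etc/hosts`.
hosts_file=$(mktemp)

echo "127.0.0.1 namenode" | tee -a "${hosts_file}" >/dev/null

cat "${hosts_file}"   # -> 127.0.0.1 namenode
```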


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
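The three remediation options above can be sketched in a small script. This is a hypothetical snippet, not taken from any of the scripts flagged in this report; it shows an in-place fix for the SC2250/SC2292 style findings (option 1) and a `# shellcheck disable=` directive suppressing a single finding (option 2). Option 3 is a CI-level setting, noted in a comment since it lives in the workflow YAML rather than the script.

```shell
#!/usr/bin/env bash
# Hypothetical example illustrating the remediation options; names are made up.

# Option 1: fix in place -- braces around expansions (SC2250),
# quoting (SC2248), and [[ ]] instead of [ ] (SC2292):
retries=3
max_retries=3
if [[ "${retries}" -eq "${max_retries}" ]]; then
    echo "retry limit reached"
fi

# Option 2: silence one finding on the next line with a directive comment.
# Here SC2006 (legacy backticks) is kept intentionally and suppressed:
# shellcheck disable=SC2006
legacy=`echo old-style`
echo "${legacy}"

# Option 3 (not shown here): add '-e SC2006' to SHELLCHECK_OPTS in the
# sh-checker step of the workflow YAML to suppress the code repo-wide.
```

Option 1 is usually preferable since the directive in option 2 must be repeated at every occurrence, while option 3 hides the finding everywhere, including places where it is a real bug.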



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename
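The shfmt hunks above are purely mechanical (indentation, `>file` spacing, trailing `&&` on continuation lines); behavior is unchanged. One pattern from the iceberg hunk is worth a second look: the `cd … && … ; cd -` chain, which shellcheck separately flags as SC2164 because a failed `cd` would run the remaining commands in the wrong directory. A hypothetical hardened form (names and URL are illustrative, not the script's real paths):

```shell
# Hypothetical hardening of the download-and-unpack chain: run it in a
# subshell so 'cd' cannot leak into the caller, and bail out if cd fails.
fetch_data() {
    local dir=$1 url=$2
    (
        cd "${dir}" || exit 1    # SC2164: never continue after a failed cd
        rm -f data.zip
        # wget "${url}" -O data.zip   # network step elided in this sketch
        echo "fetched into ${dir}"
    )                            # subshell ends: caller's cwd is untouched
}

fetch_data /tmp "https://example.com/data.zip"
```

Because the chain runs in a subshell, the trailing `cd -` from the original script becomes unnecessary.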



sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')
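SC2155 is the one warning in this group that changes behavior rather than style: `local v=$(cmd)` reports the exit status of `local` (always 0), silently discarding whether `cmd` failed. A minimal sketch, assuming bash (`might_fail` is a stand-in, not a function from this repo):

```shell
# SC2155: 'local v=$(cmd)' masks cmd's exit status, because the status
# of the 'local' builtin (always 0) wins. Declaring first preserves it.
might_fail() { return 7; }   # stand-in for a command that can fail

masked() {
    local v=$(might_fail)    # SC2155: $? is now 0 (from 'local')
    echo "masked rc=$?"
}

separate() {
    local v
    v=$(might_fail)          # $? is 7 (from might_fail)
    echo "separate rc=$?"
}

masked      # prints: masked rc=0
separate    # prints: separate rc=7
```

The same masking applies to `export v=$(cmd)` and `declare v=$(cmd)`.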


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
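SC2091 here is a real bug waiting to happen: `$(nc -z …)` substitutes nc's *output* back into the command line and tries to execute it, so the loop condition only works by accident (nc prints nothing, and an empty command inherits the substitution's status). The probe should be run directly. A hypothetical retry helper in that spirit (`wait_for` and its arguments are illustrative, not from this repo):

```shell
# SC2091: "$(cmd)" runs cmd and then tries to EXECUTE its output as a new
# command. Run the probe directly instead, with a bounded retry count.
wait_for() {
    local tries=$1
    shift
    until "$@"; do               # the probe command, no $(...) wrapper
        tries=$((tries - 1))
        [ "${tries}" -gt 0 ] || return 1
        sleep 0.1
    done
}

wait_for 3 true && echo "ready"
```

In the metastore script the probe would be `wait_for 30 nc -z localhost "${HMS_PORT:-9083}"`.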


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"
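SC2145 (flagged as an error) is about quoting `$@` inside a longer string: `"… $@"` splices the surrounding text onto the first and last argument, producing multiple words. For a log line like this one, `$*` joins all arguments into a single word, which is what was meant. A minimal sketch (`show` is illustrative):

```shell
# SC2145: inside a double-quoted string, "$@" expands to multiple words;
# "$*" joins the arguments with spaces into one word, right for logging.
show() {
    echo "args joined: $*"
}

show --user hive --pass hive   # prints: args joined: --user hive --pass hive
```

`"$@"` remains the right form when forwarding arguments to another command, as the `java … HiveSyncTool "$@"` line below does.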


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename
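For reference, the continuation style shfmt rewrites to in the diffs above is behaviorally identical to the original; a minimal standalone sketch (not code from this repository):

```shell
# A `&&` at the end of a line already continues the command on the next
# line, so the explicit backslash continuation is unnecessary.
a=$(true \
    && echo ok) # style flagged by shfmt: backslash + leading &&
b=$(true &&
    echo ok) # style shfmt produces: trailing && ends the line
```

Both chains run the same commands in the same order; only the layout differs.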




sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
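A standalone illustration of SC2006's point (hypothetical values, not this repo's code): `$(...)` produces the same result as backticks but nests without escaping:

```shell
# Flat substitution: both forms give the same value...
host_bt=`echo myhost`
host_ps=$(echo myhost)

# ...but $(...) nests cleanly; backticks would need \` escaping here.
nested=$(echo "prefix-$(echo inner)")
```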


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"
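A quick standalone sketch of why SC2250 asks for braces (hypothetical variable names): braces are optional for a bare reference but necessary once adjacent text could be parsed as part of the name:

```shell
name="world"
plain="$name"     # works: nothing follows the variable name
braced="${name}"  # same value, the SC2250-preferred form
suffix="${name}s" # braces required: "$names" would expand a variable named "names"
```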


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')
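Unlike the purely stylistic findings, SC2155 and SC2086 describe real behavior changes; a standalone sketch with hypothetical functions (not the repository's code):

```shell
# SC2155: `local out=$(cmd)` leaves $? as the status of `local` (0),
# masking a failure of cmd. Assigning separately preserves it.
masked() {
    local out=$(false) # $? is now 0: the failure of `false` is lost
    echo "$?"
}

preserved() {
    local out
    out=$(false) # $? is the status of the substitution: 1
    echo "$?"
}

# SC2086: an unquoted expansion undergoes word splitting.
count_args() { echo "$#"; }
value="a b c"
unquoted=$(count_args ${value})  # splits into 3 words
quoted=$(count_args "${value}")  # stays one word
```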


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"
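
The SC2086/SC2250 findings above all reduce to the same hazard: an unquoted expansion undergoes word splitting and globbing. A minimal sketch (the filename with a space is hypothetical, chosen to trigger the bug):

```shell
#!/usr/binio/env bash
# Hypothetical filename containing a space -- the case SC2086 warns about.
bulk_request_file="bulk request.json"

# Unquoted: word splitting produces TWO arguments.
set -- $bulk_request_file
unquoted_argc=$#

# Quoted with braces, as SC2250 suggests: ONE intact argument.
set -- "${bulk_request_file}"
quoted_argc=$#

echo "unquoted=${unquoted_argc} quoted=${quoted_argc}"   # → unquoted=2 quoted=1
```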


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit
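
For the repeated SC2164 findings, a sketch of the guarded-`cd` pattern the checker suggests (`enter_dir` and the probe directory are illustrative, not from the scripts):

```shell
#!/usr/bin/env bash
# SC2164 fix: guard cd so a failed directory change does not let the
# script keep running in the wrong working directory.
enter_dir() {
    cd "$1" || return 1   # bail out instead of silently continuing
}

if enter_dir /nonexistent-dir-for-demo 2>/dev/null; then
    result="entered"
else
    result="cd failed, aborted safely"
fi
echo "${result}"   # → cd failed, aborted safely
```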


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
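
SC2091 is worth illustrating: `while ! $(nc -z ...)` runs `nc`, then tries to execute nc's *output* as a command; the loop should test the exit status directly. A sketch with a stand-in `probe` function in place of `nc -z localhost "${HMS_PORT:-9083}"`:

```shell
#!/usr/bin/env bash
# `probe` simulates a service that becomes reachable on the third check.
attempts=0
probe() {
    attempts=$((attempts + 1))
    [[ "${attempts}" -ge 3 ]]
}

# Correct shape: test the command's exit status, no $(...) wrapper.
while ! probe; do
    sleep 0   # a real wait loop would sleep between probes
done
echo "service is up after ${attempts} probes"   # → service is up after 3 probes
```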


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
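
The SC2006/SC2010 findings in `run_sync_tool.sh` combine two fixes: `$(...)` instead of backticks, and a glob loop instead of parsing `ls | grep` output. A sketch using a temporary directory as a stand-in for `${HIVE_HOME}/lib`:

```shell
#!/usr/bin/env bash
# Hypothetical jar layout; only the loop shape matters.
demo_dir=$(mktemp -d)
touch "${demo_dir}/hive-jdbc-3.1.2.jar" \
      "${demo_dir}/hive-jdbc-handler-3.1.2.jar"

# Glob + condition: handles odd filenames, no ls parsing needed.
HIVE_JDBC=""
for jar in "${demo_dir}"/hive-jdbc-*.jar; do
    [[ "${jar}" == *handler* ]] && continue   # replaces `grep -v handler`
    HIVE_JDBC="${HIVE_JDBC}${jar}:"
done
echo "${HIVE_JDBC}"
rm -rf "${demo_dir}"
```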


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.
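
SC2125 deserves a demonstration, since the classpath assignment only works because the JVM expands `*` itself: in a plain shell assignment the glob stays literal, whereas an array expands it immediately. The directory layout below is a hypothetical stand-in for `${HADOOP_HOME}`:

```shell
#!/usr/bin/env bash
HADOOP_HOME=$(mktemp -d)
mkdir -p "${HADOOP_HOME}/share/hadoop/common"
touch "${HADOOP_HOME}/share/hadoop/common/a.jar" \
      "${HADOOP_HOME}/share/hadoop/common/b.jar"

# Plain assignment: the * is NOT expanded; the value ends in a literal '*'.
literal=${HADOOP_HOME}/share/hadoop/common/*

# Array assignment: the glob expands now, one element per jar.
jars=("${HADOOP_HOME}"/share/hadoop/common/*.jar)

echo "literal=${literal}"
echo "expanded=${#jars[@]} jars"   # → expanded=2 jars
rm -rf "${HADOOP_HOME}"
```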


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"
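
The SC2145 error above is the one genuine bug class in this batch: `"$@"` inside a larger string mixes an array with a string. For logging, `"$*"` joins the arguments into one word, which is what the `echo` intends. A sketch with a hypothetical `log_cmd` wrapper:

```shell
#!/usr/bin/env bash
# "$*" joins all arguments with spaces -- appropriate for a log line.
log_cmd() {
    echo "Running Command : java -cp demo.jar Tool $*"
}
msg=$(log_cmd --database default --table users)
echo "${msg}"   # → Running Command : java -cp demo.jar Tool --database default --table users
```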


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then
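
A short sketch of why SC2292 prefers `[[ ]]` in bash: `[[ ]]` is parsed before expansion, so empty or unquoted variables cannot collapse the test, and `-z` reads more directly than comparing against `""`:

```shell
#!/usr/bin/env bash
# With [ ], an empty unquoted variable can degrade the test to `[ == ]`.
# [[ ]] is always well-formed regardless of the variable's contents.
FAILED=""

if [[ -z "${FAILED}" ]]; then
    status="all services RUNNING"
else
    status="some services failed"
fi
echo "${status}"   # → all services RUNNING
```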


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.
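
SC2242 is flagged as an error because exit statuses are 0-255: `exit -1` is invalid in POSIX sh and in bash merely wraps to 255. A sketch (the subshell-bodied `fail_demo` is illustrative, so the `exit` does not terminate the calling shell):

```shell
#!/usr/bin/env bash
# Use an explicit small nonzero code instead of `exit -1`.
fail_demo() (
    exit 1
)
fail_demo
rc=$?
echo "exit status: ${rc}"   # → exit status: 1
```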


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file
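
SC2024 catches a real privilege bug: in `sudo echo ... >> /etc/hosts` the redirect is opened by the *unprivileged* shell before sudo ever runs. Piping into `sudo tee -a` opens the file as root. Sketched below without sudo (a temp file stands in for `/etc/hosts`); the pipe shape is the fix:

```shell
#!/usr/bin/env bash
hosts_file=$(mktemp)
NAMENODE_CONTAINER_ID=demo-namenode

# Equivalent of: echo "..." | sudo tee -a /etc/hosts >/dev/null
echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" | tee -a "${hosts_file}" >/dev/null

tail -n 1 "${hosts_file}"   # → 127.0.0.1 demo-namenode
```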


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
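As a sketch of options 1 and 2 above, the snippet below shows both remediation styles on a hypothetical fragment (the `container_id` variable and `docker_cmd` stand-in are illustrative, not taken from the PR's scripts):

```shell
#!/usr/bin/env bash
# Option 1: fix SC2086/SC2248/SC2250 by quoting and bracing the reference.
container_id="abc123"
docker_cmd() { echo "docker stop $*"; }   # stand-in so the example is runnable
docker_cmd "${container_id}"

# Option 2: suppress one finding with a directive on the line above it.
# shellcheck disable=SC2292
if [ -n "${container_id}" ]; then
    echo "container set"
fi
```

Directives apply only to the immediately following line, so prefer them over a repo-wide `-e NNNN` exclusion when the finding is a deliberate, local choice.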



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename
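For context, the pattern the diff above keeps reformatting can be sketched in isolation. `start_demo` is a hypothetical stand-in for the real `start_*` functions; shfmt normalizes `cmd > file 2>&1 &` to `cmd >file 2>&1 &`, and both forms are equivalent.

```shell
#!/usr/bin/env bash
# Fan-out pattern from the script: run each service starter in the
# background, capture its output in a log, and remember its PID.
declare -A pids
start_demo() { echo "demo started"; } # hypothetical stand-in
start_demo >start_demo.log 2>&1 &
pids["demo"]=$!
wait "${pids[demo]}"
cat start_demo.log
rm -f start_demo.log
```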


sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
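To show the SC2006 fix in isolation: `$(...)` behaves like backticks here but nests and quotes cleanly. `get_host` is a hypothetical stand-in for `hostname -f` so the sketch is self-contained.

```shell
# $(...) is equivalent to legacy backticks but nests without escaping,
# which is why SC2006 recommends it.
unset CORE_CONF_fs_defaultFS # ensure the default branch is exercised
get_host() { echo "demo-host.local"; } # stand-in for `hostname -f`
fs_default="${CORE_CONF_fs_defaultFS:-hdfs://$(get_host):8020}"
echo "${fs_default}"
```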


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')
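A minimal sketch of the SC2155 fix, using a hypothetical `escape_slashes` helper with the same `sed` substitution as the finding above:

```shell
# SC2155: `local x=$(cmd)` returns the exit status of `local` (always 0),
# so a failure of cmd goes unnoticed. Declare first, then assign.
escape_slashes() {
    local escaped
    escaped=$(printf '%s' "$1" | sed 's/\//\\\//g') || return 1
    printf '%s\n' "${escaped}"
}
escape_slashes "a/b/c"
```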


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"
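The word-splitting behind SC2086, shown with a hypothetical `count_args` helper:

```shell
# SC2086: an unquoted expansion is split on whitespace and glob-expanded,
# so a value containing a space arrives as two separate arguments.
count_args() { echo "$#"; }
value="two words"
count_args ${value}   # unquoted: split into 2 arguments
count_args "${value}" # quoted: passed as 1 argument
```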


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit
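The SC2164 guard in a self-contained sketch (a scratch directory stands in for the script's data directories):

```shell
# SC2164: a failed cd leaves the script running in the wrong directory;
# `|| exit` (or `|| return` inside a function) turns that into a hard stop.
workdir=$(mktemp -d)
cd "${workdir}" || exit 1
# ... do work relative to ${workdir} ...
cd - >/dev/null || exit 1
rmdir "${workdir}"
```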


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
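Why SC2091 matters here: `while ! $(nc -z ...)` runs `nc`, then tries to execute its *output* as a command; the intent is to test the exit status directly. In this sketch `port_open` is a hypothetical probe that succeeds on the third try, standing in for `nc -z`:

```shell
# Drop the $(...) and test the command's exit status directly.
tries=0
port_open() {
    tries=$((tries + 1))
    [ "${tries}" -ge 3 ] # "port comes up" on the 3rd probe
}
while ! port_open; do
    : # would sleep and retry here
done
echo "up after ${tries} probes"
```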


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.
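The SC2011 suggestion in miniature: `find -print0 | xargs -0` delivers each filename intact, where `ls | xargs` would mangle names containing spaces or newlines. The scratch files here are illustrative only:

```shell
# NUL-delimited pipeline survives spaces in filenames.
tmp=$(mktemp -d)
touch "${tmp}/plain.hql" "${tmp}/with space.hql"
find "${tmp}" -type f -name '*.hql' -print0 | xargs -0 -n 1 basename | sort
rm -r "${tmp}"
```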


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then
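A short illustration of the SC2292 preference: `[[ ]]` is a bash keyword, so no word splitting happens inside it, and it supports pattern matches that `[ ]` cannot do portably:

```shell
# [[ ]] is safe with unquoted expansions and supports glob patterns.
HADOOP_HOME=""
if [[ -z ${HADOOP_HOME} ]]; then echo "HADOOP_HOME unset"; fi
host="hadoop-master"
[[ ${host} == hadoop-* ]] && echo "matched"
```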


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.
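On SC2125: globs stay literal in a plain assignment, so the variable holds the text `.../*` rather than expanded paths. For `java -cp` the JVM expands `dir/*` itself, so the literal glob happens to work; when shell-side expansion is wanted, an array is the robust form. A sketch with throwaway jar files:

```shell
# Glob expands inside the array assignment, not in a plain string one.
tmp=$(mktemp -d)
touch "${tmp}/a.jar" "${tmp}/b.jar"
jars=("${tmp}"/*.jar)
classpath=$(IFS=:; printf '%s' "${jars[*]}") # join with ':'
echo "${classpath}"
rm -r "${tmp}"
```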


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"
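The SC2145 distinction, with a hypothetical `log_cmd` echoing a command line like the one above: inside a longer double-quoted string, `"$@"` splits the string into one word per argument, while `"$*"` joins the arguments into a single word, which is what a log message wants.

```shell
# "$*" joins all arguments with spaces into one word for the message.
log_cmd() {
    echo "Running Command : java -cp demo.jar HiveSyncTool $*"
}
log_cmd --database default --table t1
```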


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------
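Most of the findings above fall into a handful of recurring patterns: unquoted/unbraced expansions (SC2086/SC2250), `[ ]` instead of `[[ ]]` in bash (SC2292), `exit -1` (SC2242), and combined declare-and-assign (SC2155). A minimal before/after sketch of the fixed style — the function and variable names here are hypothetical, not taken from the flagged scripts:

```shell
#!/usr/bin/env bash
set -euo pipefail

# SC2155: declare and assign separately so the assignment cannot
# mask a non-zero exit status from the command substitution.
resolve_data_dir() {
    local data_dir
    data_dir=$(realpath .)
    echo "${data_dir}"
}

# SC2292 + SC2086/SC2250: prefer [[ ]] in bash, and quote/brace expansions.
check_host() {
    local ip_host=$1
    if [[ "_${ip_host}" == "_" ]]; then
        # SC2242: exit status must be 0-255, so use 1 instead of -1.
        exit 1
    fi
    echo "host: ${ip_host}"
}

check_host "10.0.0.1"
```

Running shellcheck over a script written in this style produces none of the warning classes listed above.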

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
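For option 2, the directive comment goes on its own line immediately above the offending statement and suppresses only that finding at that location. A small sketch (the path and variable are hypothetical):

```shell
#!/usr/bin/env bash
set -euo pipefail

conf_dir="/tmp/demo conf dir"   # hypothetical path containing a space

# Option 1: fix the issue directly -- quoting the expansion makes the
# path survive word splitting, which resolves SC2086/SC2250 outright.
mkdir -p "${conf_dir}"

# Option 2: keep the code as written but silence one specific check
# with a directive placed directly above the line it applies to.
# shellcheck disable=SC2292
if [ -d "${conf_dir}" ]; then
    result="present"
else
    result="missing"
fi
echo "${result}"
```

For option 3, the equivalent global setting in the workflow file would be `SHELLCHECK_OPTS: -e SC2292`, which disables that code for every script the action scans, so the per-line directive is usually the safer choice.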



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt -w filename


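The two shellcheck classes that dominate the report above — SC2086 (unquoted expansions) and SC2006 (legacy backticks) — both come down to word splitting and command substitution. A minimal, self-contained sketch of why the fixes matter (the variable names here are illustrative only, not taken from the patched scripts):

```shell
#!/usr/bin/env bash
# SC2086: an unquoted expansion is word-split on whitespace;
# a quoted one is passed through as a single argument.
value="a b  c"

set -- $value        # unquoted: splits into three words
echo "unquoted_words=$#"   # prints unquoted_words=3

set -- "$value"      # quoted: stays one argument
echo "quoted_words=$#"     # prints quoted_words=1

# SC2006: $(...) replaces legacy backticks and nests without escaping.
outer="$(echo "$(echo nested)")"
echo "outer=${outer}"      # prints outer=nested
```

This is why the diffs above also quote `${CUR_DIR}` and guard `cd` with `|| exit`: an unquoted path containing spaces would split into multiple arguments, and a failed `cd` would otherwise let the following commands run in the wrong directory.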
@mymeiyi mymeiyi force-pushed the add-mow-compact-key branch from 6cbdf08 to 138f965 on February 21, 2025 09:48

sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit
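All six SC2164 findings above share one fix: guard each `cd` so the script aborts instead of silently continuing in the wrong directory. A minimal sketch of the pattern (directory name is a placeholder; the real scripts would use their own paths):

```shell
#!/usr/bin/env bash
set -euo pipefail

run_in_dir() {
    # Abort instead of running the remaining commands in the wrong directory.
    cd "$1" || exit 1
    pwd
    # 'cd -' can also fail (e.g. if OLDPWD was removed), so guard it too.
    cd - >/dev/null || exit 1
}

run_in_dir /tmp
```

With `set -e` in effect the `|| exit 1` is belt-and-braces, but it is the form ShellCheck suggests and it also protects scripts that run without `set -e`.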


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
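SC2091 here is a real bug rather than style: `while ! $(nc -z ...)` runs `nc` and then tries to *execute its output*; the loop condition should test the exit status directly (`while ! nc -z ...`). A sketch of the corrected retry loop, using any command in place of `nc` so it runs anywhere:

```shell
#!/usr/bin/env bash

# Wrong:  while ! $(cmd); do ...   -- executes cmd's *output*
# Right:  while ! cmd; do ...      -- tests cmd's *exit status*
wait_for() {
    local tries=0
    while ! "$@"; do
        tries=$((tries + 1))
        if [ "${tries}" -ge 3 ]; then
            return 1    # gave up after 3 attempts
        fi
    done
    return 0
}
```

In the metastore script the call would be `wait_for nc -z localhost "${HMS_PORT:-9083}"` (with a `sleep` between attempts); the retry count here is an illustrative stand-in.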


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
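The SC2006/SC2010 pair reported against `run_sync_tool.sh` is usually fixed together: replace backticks with `$(...)` and replace `ls | grep` with a glob loop, which survives unusual filenames. A hedged sketch of the filtering part only (the original's `ls -c` also sorts by change time, which a plain glob does not reproduce; file names below are stand-ins created in a temp dir):

```shell
#!/usr/bin/env bash
set -euo pipefail

dir=$(mktemp -d)
touch "${dir}/hoodie-hive-sync-bundle.jar" \
      "${dir}/hoodie-hive-sync-bundle-sources.jar"

# Instead of: JAR=`ls ${dir}/*.jar | grep -v source | head -1`
jar=""
for candidate in "${dir}"/*.jar; do
    case "${candidate}" in
        *source*) continue ;;               # same filter as 'grep -v source'
        *) jar="${candidate}"; break ;;     # first non-source match, as 'head -1' did
    esac
done
echo "${jar}"
```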


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.
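SC2125 on the `HADOOP_HIVE_JARS` line flags that the `*` globs are stored as literal text; whether they ever expand then depends on later unquoted use. If expansion at assignment time is intended, an array makes that explicit and keeps each match as one element. A minimal illustration with stand-in jar files:

```shell
#!/usr/bin/env bash
set -euo pipefail

libdir=$(mktemp -d)
touch "${libdir}/a.jar" "${libdir}/b.jar"

# The glob expands here, once; each matching file is a separate array element.
jars=( "${libdir}"/*.jar )

# Join the elements with ':' to build a classpath-style string.
classpath=$(IFS=:; printf '%s' "${jars[*]}")
echo "${classpath}"
```

Note that Java class paths can also carry literal `dir/*` entries that the JVM expands itself; which behavior the original script wants determines whether this change is appropriate.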


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"
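SC2145 is rated an error because `"... $@"` mixes a string with an array and splits the message into multiple arguments; for a log line the joined form `$*` is the intended one, while `"$@"` stays correct for the actual `java` invocation on the next line. Sketch (function and argument names are illustrative):

```shell
#!/usr/bin/env bash

log_command() {
    local cp="$1"; shift
    # "$*" joins all remaining arguments with spaces into one word -- right for logging.
    # "$@" keeps them as separate words -- right for actually executing a command.
    printf 'Running Command : java -cp %s %s\n' "${cp}" "$*"
}
```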


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).
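SC2020 is worth acting on here: `tr -d "addr:"` deletes every occurrence of each listed *character* (`a`, `d`, `r`, `:`), not the word `addr:`, so it only appears to work because IPv4 addresses contain none of those characters. Deleting the literal prefix with `sed` says what is meant. A sketch against a canned old-style `ifconfig` line (the line content is a stand-in):

```shell
#!/usr/bin/env bash
set -euo pipefail

line="        inet addr:192.168.1.10  Bcast:192.168.1.255"

# tr -d "addr:" strips the characters a, d, r, : anywhere in the input;
# sed removes only the literal "addr:" prefix.
ip=$(printf '%s\n' "${line}" | awk '{print $2}' | sed 's/^addr://')
echo "${ip}"
```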


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.
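SC2242 (`exit -1`) is an error because POSIX exit statuses are 0–255: bash happens to wrap `-1` to 255, but other shells reject it outright. The conventional replacement for "generic failure" is `exit 1`. Sketch of the surrounding check (message text is illustrative):

```shell
#!/usr/bin/env bash

require_ip() {
    if [ "_$1" = "_" ]; then
        echo "ERROR: failed to detect host IP" >&2
        exit 1   # valid exit codes are 0-255; 'exit -1' is unportable
    fi
    echo "host ip: $1"
}
```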


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file
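SC2024 on the `/etc/hosts` line is a genuine bug, not style: the `>>` redirect is opened by the *calling* shell before `sudo` ever runs, so without root the append fails even though `echo` ran privileged. The standard fix pipes through `sudo tee -a`. Demonstrated here without `sudo`, against a scratch file:

```shell
#!/usr/bin/env bash
set -euo pipefail

hosts=$(mktemp)

# Wrong:  sudo echo "127.0.0.1 node1" >> /etc/hosts    (the shell opens the file, not sudo)
# Right:  echo "127.0.0.1 node1" | sudo tee -a /etc/hosts >/dev/null
echo "127.0.0.1 node1" | tee -a "${hosts}" >/dev/null
```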


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
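SC2155 on `export TPCH_DATA=$(realpath ...)` matters when `set -e` is active: the combined form reports `export`'s exit status (always 0), masking a failure of the command substitution. Assigning first and exporting second restores error propagation:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Combined form hides a failing substitution:
#   export TPCH_DATA=$(realpath missing/dir)   # overall status is export's, i.e. 0
# Separated form lets set -e catch the failure:
TPCH_DATA=$(realpath .)
export TPCH_DATA
echo "${TPCH_DATA}"
```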


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt -w filename


sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
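
The fix above is more than style: `$(...)` nests without escaping, which matters once a command like `hostname -f` gets wrapped further. A minimal sketch with toy values (not from the repo):

```shell
#!/usr/bin/env bash
# Backticks need backslash-escaping to nest; $(...) nests cleanly.
inner=$(echo "inner")
outer=$(echo "host-$(echo "${inner}")")
echo "${outer}"   # host-inner
```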


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')
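
SC2155 is easy to dismiss, but the effect is real: the exit status of `local` overwrites the command's. A toy demonstration (hypothetical functions, not from `hive-configure.sh`):

```shell
#!/usr/bin/env bash
masked() {
  local v=$(false)   # $? is local's status: always 0
  echo "$?"
}
split() {
  local v
  v=$(false)         # $? is the command's status
  echo "$?"
}
masked   # prints 0
split    # prints 1
```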


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"
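
Most findings in this file are the same SC2086/SC2250 pattern. The splitting risk is concrete whenever a value can contain whitespace; a sketch with a toy path (not one from the repo):

```shell
#!/usr/bin/env bash
path="/tmp/a b"                                   # toy value with a space
unquoted_words=$(printf '%s\n' $path | wc -l)     # splits into 2 words
quoted_words=$(printf '%s\n' "$path" | wc -l)     # stays 1 argument
echo "${unquoted_words} ${quoted_words}"
```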


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit
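
An unguarded `cd -` keeps the script executing in the wrong directory when the switch fails. A guarded helper, sketched with a deliberately missing path:

```shell
#!/usr/bin/env bash
safe_cd() {
  cd "$1" || return 1   # propagate failure instead of continuing
}
safe_cd /no/such/dir 2>/dev/null
echo "$?"   # 1
```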


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
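
SC2091 flags a genuine bug class, not style: `$(nc -z ...)` runs `nc` and then tries to execute whatever it printed; the loop condition should be the command itself. A sketch with a stand-in probe (hypothetical, in place of the real `nc -z` check):

```shell
#!/usr/bin/env bash
attempts=0
probe() {                      # stand-in for: nc -z localhost 9083
  attempts=$((attempts + 1))
  [ "${attempts}" -ge 3 ]      # "port opens" on the third try
}
while ! probe; do              # no $(...) around the condition
  sleep 0.1
done
echo "ready after ${attempts} attempts"
```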


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.
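
In a scalar assignment the `*` stays literal, which happens to be what `java -cp` wants here; but if the intent were to expand the glob at assignment time, an array is the idiom. A sketch with a temporary directory (toy names, not the Hudi jars):

```shell
#!/usr/bin/env bash
dir=$(mktemp -d)
touch "${dir}/a.jar" "${dir}/b.jar"

jars=("${dir}"/*.jar)          # array: glob expands at assignment
echo "${#jars[@]}"             # 2

cp_spec="${dir}/*.jar"         # scalar: '*' kept literal, e.g. for java -cp
echo "${cp_spec}"
```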


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"
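
Inside a double-quoted string, `"$@"` expands each argument as a separate word and shellcheck flags the mix; `$*` joins them into one string, which is what a log line wants. A sketch with toy arguments:

```shell
#!/usr/bin/env bash
log_cmd() {
  echo "Running Command : java ... HiveSyncTool $*"   # $* joins args
}
out=$(log_cmd --database demo --table t1)
echo "${out}"
```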


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).
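
`tr -d "addr:"` deletes every occurrence of the characters `a`, `d`, `r`, and `:`, not the string `addr:`; it only works here by luck because an IP is purely numeric. Parameter expansion removes just the literal prefix. A sketch with a toy value that shows the mangling:

```shell
#!/usr/bin/env bash
s="addr:bad1"
echo "${s}" | tr -d "addr:"   # deletes the chars a,d,r,':' -> b1
echo "${s#addr:}"             # strips only the literal prefix -> bad1
```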


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
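
The first two routes can be sketched in a few lines. This is a minimal, hypothetical example (not from the scripts in this PR) using SC2292 ("prefer `[[ ]]` over `[ ]`") as the finding being addressed:

```shell
#!/usr/bin/env bash
# Sketch of remediation routes 1 and 2, using SC2292 as the example finding.

val="1"

# Route 1: fix the code itself -- [[ ]] is the Bash-native conditional,
# which is what shellcheck suggests here:
if [[ "${val}" == "1" ]]; then
    route1="fixed"
fi

# Route 2: keep the code as-is but silence this one finding on the next line:
# shellcheck disable=SC2292
if [ "${val}" == "1" ]; then
    route2="suppressed"
fi

echo "${route1} ${route2}"
```

Route 3 lives in the workflow file rather than the script: adding `-e SC2292` to the `SHELLCHECK_OPTS` environment variable in the action's `.yml` disables the check repo-wide, so it is best reserved for purely stylistic codes.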



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename
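Beyond indentation, the pass above also rewrites legacy backticks into `$(...)` (SC2006), as in the run_sync_tool.sh hunks. The two forms evaluate the same command, but `$(...)` nests and quotes predictably. A minimal sketch of the rewritten shape — the `localhost` fallback is illustrative only (it is not in the scripts), added so the snippet also runs where `hostname -f` is unavailable:

```shell
# SC2006: prefer $(...) over legacy backticks; quoting inside $(...)
# works without the escaping that nested backticks would need.
fs_host="$(hostname -f 2>/dev/null || echo localhost)"  # fallback is illustrative
default_fs="hdfs://${fs_host}:8020"
echo "${default_fs}"
```

With backticks, the inner double quotes and any nested substitution would each need escaping, which is exactly the fragility SC2006 warns about.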


mymeiyi pushed a commit that referenced this pull request Feb 24, 2025
WRITE of size 1 at 0x6160007e86f0 thread T1983 (Pipe_normal [wo)
#0 0x55fc8065b975 in std::__atomic_base<bool>::store(bool,
std::memory_order)
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/atomic_base.h:457:2
#1 0x55fc8065b975 in std::__atomic_base<bool>::operator=(bool)
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/atomic_base.h:349:2
#2 0x55fc8065b975 in std::atomic<bool>::operator=(bool)
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/atomic:80:22
#3 0x55fc8065b975 in doris::pipeline::PipelineTask::set_running(bool)
/root/doris/be/src/pipeline/pipeline_task.h:192:47
#4 0x55fc8065b975 in
doris::pipeline::TaskScheduler::_do_work(int)::$_0::operator()() const
/root/doris/be/src/pipeline/task_scheduler.cpp:121:23
#5 0x55fc8065b975 in
doris::Defer<doris::pipeline::TaskScheduler::_do_work(int)::$_0>::~Defer()
/root/doris/be/src/util/defer_op.h:37:16
#6 0x55fc8065b975 in doris::pipeline::TaskScheduler::_do_work(int)
/root/doris/be/src/pipeline/task_scheduler.cpp:162:5
#7 0x55fc4c57cd19 in doris::ThreadPool::dispatch_thread()
/root/doris/be/src/util/threadpool.cpp:608:24
#8 0x55fc4c55395e in std::function<void ()>::operator()() const
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/std_function.h:560:9
#9 0x55fc4c55395e in doris::Thread::supervise_thread(void*)
/root/doris/be/src/util/thread.cpp:498:5
#10 0x7f9ee3d25608 in start_thread
/build/glibc-SzIz7B/glibc-2.31/nptl/pthread_create.c:477:8
#11 0x7f9ee3fd2132 in __clone
/build/glibc-SzIz7B/glibc-2.31/misc/../sysdeps/unix/sysv/linux/x86_64/clone.S:95

0x6160007e86f0 is located 624 bytes inside of 632-byte region
[0x6160007e8480,0x6160007e86f8)
freed by thread T1981 (Pipe_normal [wo) here:
#0 0x55fc47aa680d in operator delete(void*)
(/mnt/ssd01/pipline/OpenSourceDoris/clusterEnv/P0/Cluster0/be/lib/doris_be+0x3376e80d)
(BuildId: 865149e62959581e)
#1 0x55fc8059db84 in
std::default_delete<doris::pipeline::PipelineTask>::operator()(doris::pipeline::PipelineTask*)
const
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/unique_ptr.h:85:2
#2 0x55fc8059db84 in std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >::~unique_ptr()
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/unique_ptr.h:361:4
#3 0x55fc8059db84 in void
std::destroy_at<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >
>(std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >*)
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/stl_construct.h:88:15
#4 0x55fc8059db84 in void
std::_Destroy<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >
>(std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >*)
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/stl_construct.h:138:7
#5 0x55fc8059db84 in void
std::_Destroy_aux<false>::__destroy<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask>
>*>(std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >*,
std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >*)
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/stl_construct.h:152:6
#6 0x55fc8059db84 in void
std::_Destroy<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask>
>*>(std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >*,
std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >*)
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/stl_construct.h:184:7
#7 0x55fc8059db84 in void
std::_Destroy<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >*,
std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >
>(std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >*,
std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >*,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > >&)
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/alloc_traits.h:746:7
#8 0x55fc8059db84 in
std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > >::~vector()
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/stl_vector.h:680:2
#9 0x55fc8052571c in void
std::destroy_at<std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > >
>(std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > >*)
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/stl_construct.h:88:15
#10 0x55fc8052571c in void
std::_Destroy<std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > >
>(std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > >*)
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/stl_construct.h:138:7
#11 0x55fc8052571c in void
std::_Destroy_aux<false>::__destroy<std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > >
>*>(std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > >*,
std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > >*)
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/stl_construct.h:152:6
#12 0x55fc8052571c in void
std::_Destroy<std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > >
>*>(std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > >*,
std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > >*)
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/stl_construct.h:184:7
#13 0x55fc8052571c in void
std::_Destroy<std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > >*,
std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > >
>(std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > >*,
std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > >*,
std::allocator<std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > > >&)
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/alloc_traits.h:746:7
#14 0x55fc8052571c in
std::vector<std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > >,
std::allocator<std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > > >
>::_M_erase_at_end(std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > >*)
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/stl_vector.h:1796:6
#15 0x55fc8052571c in
std::vector<std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > >,
std::allocator<std::vector<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> >,
std::allocator<std::unique_ptr<doris::pipeline::PipelineTask,
std::default_delete<doris::pipeline::PipelineTask> > > > > >::clear()
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/stl_vector.h:1499:9
#16 0x55fc8052571c in
doris::pipeline::PipelineFragmentContext::~PipelineFragmentContext()
/root/doris/be/src/pipeline/pipeline_fragment_context.cpp:142:12
#17 0x55fc47ad30cc in
std::_Sp_counted_base<(__gnu_cxx::_Lock_policy)2>::_M_release()
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/shared_ptr_base.h:168:6
#18 0x55fc80658d57 in
std::__shared_count<(__gnu_cxx::_Lock_policy)2>::~__shared_count()
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/shared_ptr_base.h:702:11
#19 0x55fc80658d57 in std::__shared_ptr<doris::TaskExecutionContext,
(__gnu_cxx::_Lock_policy)2>::~__shared_ptr()
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/shared_ptr_base.h:1149:31
#20 0x55fc80658d57 in
doris::pipeline::close_task(doris::pipeline::PipelineTask*,
doris::Status) /root/doris/be/src/pipeline/task_scheduler.cpp:100:1
#21 0x55fc8065aa17 in doris::pipeline::TaskScheduler::_do_work(int)
/root/doris/be/src/pipeline/task_scheduler.cpp:160:36
#22 0x55fc4c57cd19 in doris::ThreadPool::dispatch_thread()
/root/doris/be/src/util/threadpool.cpp:608:24
#23 0x55fc4c55395e in std::function<void ()>::operator()() const
/var/local/ldb-toolchain/bin/../lib/gcc/x86_64-linux-gnu/11/../../../../include/c++/11/bits/std_function.h:560:9
#24 0x55fc4c55395e in doris::Thread::supervise_thread(void*)
/root/doris/be/src/util/thread.cpp:498:5
#25 0x7f9ee3d25608 in start_thread
/build/glibc-SzIz7B/glibc-2.31/nptl/pthread_create.c:477:8
@mymeiyi mymeiyi force-pushed the add-mow-compact-key branch from 9676ce8 to 14e971b on February 24, 2025 02:50

sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"
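
A minimal sketch of the SC2086 hazard behind this `addProperty` call (function and values are hypothetical): an unquoted variable is split on whitespace, so one logical argument arrives as several.

```shell
#!/usr/bin/env bash
set -u

# Echoes how many positional arguments the caller actually passed.
count_args() { echo "$#"; }

value="first second"
echo "unquoted: $(count_args ${value})"    # word splitting: 2 arguments
echo "quoted:   $(count_args "${value}")"  # quoted: 1 argument
```

The same splitting (plus glob expansion) is why every `"${path}" "${name}" "${value}"` in the suggested fix is quoted.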


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit
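
A minimal sketch of the SC2164 pattern these `run.sh` scripts need (directory names are hypothetical): without `|| exit`, a failed `cd` leaves the script running in the wrong directory.

```shell
#!/usr/bin/env bash
set -u

workdir=$(mktemp -d)

# Guarded cd: abort instead of silently operating on the current directory.
cd "${workdir}" || exit 1
touch marker.txt

# `cd -` jumps back to $OLDPWD (and prints it, hence the redirect);
# guard it the same way.
cd - >/dev/null || exit 1

[[ -f "${workdir}/marker.txt" ]] && echo "marker created"
```

Inside a function, `|| return` is the analogous guard.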


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
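
A minimal sketch of the SC2091 fix, with a stand-in predicate instead of the real `nc -z` probe: `$(...)` substitutes the command's *output* back onto the command line, so `while ! $(nc -z ...)` tests that output, not `nc` itself. Drop the `$( )` and test the command directly.

```shell
#!/usr/bin/env bash
set -u

# Hypothetical stand-in for `nc -z localhost "${HMS_PORT:-9083}"`.
port_open() { [[ "$1" == "open" ]]; }

attempts=0
# Correct form: the command itself is the loop condition, with a bound.
until port_open "closed" || (( attempts >= 3 )); do
  attempts=$((attempts + 1))
  sleep 0  # placeholder for the real back-off
done
echo "gave up after ${attempts} attempts"
```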


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.
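
A minimal sketch of one way to satisfy SC2125 (directory layout is hypothetical): globs inside a plain assignment stay literal, so collect the jars in a bash array and join with `:`. Note that if the literal `dir/*` was intended for java's own classpath wildcard handling, quoting the assignment is the right fix instead.

```shell
#!/usr/bin/env bash
set -u

libdir=$(mktemp -d)
touch "${libdir}/a.jar" "${libdir}/b.jar"

# The glob expands inside the array assignment.
jars=( "${libdir}"/*.jar )

# Join the entries with ':' (first char of IFS) in a subshell.
classpath=$(IFS=':'; echo "${jars[*]}")
echo "${classpath}"
```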


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"
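
A minimal sketch of the SC2145 distinction (class and flags are hypothetical): inside a double-quoted string, `$@` expands to multiple words and the string fragments attach to the first and last argument, so use `$*` when logging a command line and keep `"$@"` for actually executing it.

```shell
#!/usr/bin/env bash
set -u

log_cmd() {
  # "$*" joins all arguments into one word: safe inside a log string.
  echo "Running Command : java -cp demo.jar DemoSyncTool $*"
}

log_cmd --mode sync --table t1
```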


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file
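For SC2024 the usual fix is the `sudo tee -a` pattern: in `sudo cmd >>file` the redirect is opened by the unprivileged calling shell, so it fails whenever the target is root-owned. The sketch below demonstrates the `tee -a` append semantics on a temp file (without sudo, so it stays runnable anywhere); the commented line shows how it would apply to the flagged `/etc/hosts` write.

```shell
#!/usr/bin/env bash
# SC2024: route the write through tee, which runs under sudo:
#   echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" | sudo tee -a /etc/hosts >/dev/null
tmp=$(mktemp)
echo "first" >"${tmp}"
echo "second" | tee -a "${tmp}" >/dev/null   # tee -a appends stdin to the file
cat "${tmp}"
rm -f "${tmp}"
```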


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
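The point of SC2155 is easy to demonstrate: `export VAR=$(cmd)` returns `export`'s status, so a failing command substitution goes unnoticed. Assigning first and exporting second keeps `$?` meaningful (variable names below are illustrative).

```shell
#!/usr/bin/env bash
# SC2155: export's own success masks the status of the command substitution.
export MASKED=$(false)       # the failure of `false` is swallowed
echo "masked: $?"            # prints 0
VISIBLE=$(false)             # plain assignment propagates the status
echo "visible: $?"           # prints 1
export VISIBLE
```

Applied to the flagged line, that would be `TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)` followed by `export TPCH_DATA`, so a bad path can actually be detected.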


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env
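Quoting `${i}` does not resolve the SC1090 half of this finding: shellcheck still cannot follow a path computed at runtime. The conventional fix is a `source=` directive (or `source=/dev/null` to skip following entirely), sketched here with a throwaway settings file standing in for `kerberos${i}_settings.env`.

```shell
#!/usr/bin/env bash
# SC1090: a directive tells shellcheck what to do with a dynamic source path.
settings=$(mktemp)
echo 'KRB_REALM=EXAMPLE.COM' >"${settings}"
# shellcheck source=/dev/null
. "${settings}"              # shellcheck no longer warns about this line
echo "${KRB_REALM}"
rm -f "${settings}"
```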


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
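Option 2 looks like this in practice: the directive comment suppresses the named check for the line directly beneath it. In the sketch below, SC2086 (unquoted expansion) is silenced because the word splitting is deliberate; the variable and the suppressed code are illustrative, not from this PR.

```shell
#!/usr/bin/env bash
# Inline suppression: the directive applies only to the next line.
greeting="hello world"
# shellcheck disable=SC2086
set -- $greeting             # intentionally split into two positional args
echo "$#"
```

Option 3 is the same suppression applied repo-wide: per the action's documentation, something like `SHELLCHECK_OPTS: -e SC2086` in the workflow `.yml` disables the check for every script the job lints, so it is best reserved for style codes the project has consciously opted out of.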



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename
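Context for the shfmt hunks above: changes like `start_es > start_es.log 2>&1 &` → `start_es >start_es.log 2>&1 &` touch only whitespace, which shfmt normalizes. A minimal check (not part of the report) that the two redirection spellings are equivalent:

```shell
# Redirection with and without a space behaves identically; shfmt only
# normalizes the spacing, so the rewritten scripts run unchanged.
echo hello > spaced.log
echo hello >unspaced.log
cmp -s spaced.log unspaced.log && echo "identical"
rm -f spaced.log unspaced.log
```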



sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
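A quick illustration of why SC2006 prefers `$(...)`: both forms capture command output, but `$(...)` nests without backslash-escaping (a minimal sketch, not from the repo):

```shell
kernel_legacy=`uname -s`    # backticks work, but nesting them needs \` escapes
kernel_modern=$(uname -s)   # $(...) nests cleanly and reads better
nested=$(echo "outer-$(echo inner)")
echo "$nested"              # -> outer-inner
```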


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')
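The SC2155 warning above is about exit-status masking: `local` itself returns 0, which hides any failure of the command substitution on the same line. A minimal demonstration (hypothetical functions, not from the repo):

```shell
masked() {
    local v=$(false)   # `local` succeeds, so the failure of `false` is lost
    echo "rc=$?"
}
split() {
    local v
    v=$(false)         # now $? reflects the substituted command
    echo "rc=$?"
}
masked   # -> rc=0
split    # -> rc=1
```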


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"
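Most of the SC2086 hits above (`addProperty $path $name ...`) come down to word splitting: an unquoted expansion is split on whitespace before the function ever sees it. A minimal illustration with a hypothetical helper:

```shell
count_args() { echo $#; }   # hypothetical helper: prints its argument count

path="file with spaces.xml"
count_args $path     # unquoted: splits into 3 arguments
count_args "$path"   # quoted: passed as a single argument
```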


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit
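SC2164 recurs across these run.sh scripts because a failed `cd` otherwise leaves the rest of the script running in the wrong directory. A minimal sketch of the guarded form (hypothetical function, not from the repo):

```shell
enter() {
    # without `|| return 1`, a failed cd would let the function keep
    # running in the old working directory
    cd "$1" 2>/dev/null || return 1
    pwd
}
enter /nonexistent_dir_xyz || echo "cd failed, aborted safely"
```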


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
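SC2091 here is not purely stylistic: `$(...)` substitutes the probe's stdout, and the shell then executes that text as a new command. The `nc -z` loop only works by accident because `nc -z` prints nothing; the intended form is `while ! nc -z localhost "${HMS_PORT:-9083}"`. A minimal illustration with a hypothetical probe:

```shell
probe() { echo "echo surprise"; }   # hypothetical probe that prints text

$(probe)                     # the OUTPUT is executed: prints "surprise"

if probe >/dev/null; then    # correct: test the exit status directly
    echo "probe succeeded"
fi
```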


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then
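SC2292 is a style preference, but `[[ ]]` also sidesteps the splitting pitfalls flagged elsewhere in this report, because it does not word-split unquoted expansions. A minimal illustration (bash-specific; `[[ ]]` is not POSIX sh):

```shell
v="a b"
[ -n $v ] 2>/dev/null || echo "single brackets split the unquoted value"
[[ -n $v ]] && echo "double brackets need no quoting here"
```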


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"

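Note that the suggested rewrite above only addresses the brace-style notes; the SC2145 error itself remains, because `$@` inside a double-quoted string mixes string and array contexts. A minimal sketch of the `$*` fix, using made-up names rather than the repo's actual variables:

```shell
#!/usr/bin/env bash
# SC2145 fix sketch (CLASSPATH / org.example.Tool are stand-ins, not the
# repo's script): inside a double-quoted string use "$*", which joins all
# arguments into one word, instead of "$@", which expands as an array.
CLASSPATH="a.jar:b.jar"   # stand-in for the Hadoop/Hive jar list
print_command() {
    echo "Running Command : java -cp ${CLASSPATH} org.example.Tool $*"
}
print_command --database default --table t1
```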

In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file

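SC2024 gets no auto-suggestion above. The usual rewrite (an illustration, not the tool's output) routes the append through `tee` so the write itself runs under the elevated command; `/tmp/hosts.demo` stands in for `/etc/hosts` here, and the real script would use `sudo tee -a`:

```shell
#!/usr/bin/env bash
# SC2024 pattern: in 'sudo echo ... >> file' the redirect is performed by
# the invoking shell (unprivileged), not by sudo. Piping into tee moves
# the file write into the command that can be elevated.
demo_file=/tmp/hosts.demo
: >"${demo_file}"
echo "127.0.0.1 namenode" | tee -a "${demo_file}" >/dev/null
cat "${demo_file}"
```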

In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.

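SC2155 also comes without a suggested rewrite. The point is that `export VAR=$(cmd)` returns `export`'s exit status, hiding a failure of `cmd`; declaring and assigning separately keeps `$?` tied to the substitution. A sketch using `/tmp` purely as a path that exists everywhere:

```shell
#!/usr/bin/env bash
# SC2155 fix sketch: assign first so $? reflects the command substitution,
# then export in a separate statement.
TPCH_DATA=$(realpath /tmp)   # $? now reports realpath itself
export TPCH_DATA
echo "${TPCH_DATA}"
```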

In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
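Option 2 in practice: the directive comment silences exactly one finding on the line that follows it. The paths below are a throwaway fixture, not the repo's layout:

```shell
#!/usr/bin/env bash
# '# shellcheck disable=SC2012' suppresses the "use find instead of ls"
# info for the next line only; the rest of the file is still checked.
HIVE_HOME=/tmp/hive-demo
mkdir -p "${HIVE_HOME}/lib"
touch "${HIVE_HOME}/lib/hive-jdbc-3.1.3.jar"
# shellcheck disable=SC2012
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')
echo "${HIVE_JDBC}"
```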



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename


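Most of the findings above reduce to two habits that `shfmt`/`shellcheck` enforce: use `$(...)` instead of legacy backticks (SC2006), and quote expansions so they do not word-split (SC2086). A minimal bash sketch of the quoted form, using made-up values rather than any project file:

```shell
#!/usr/bin/env bash
# A filename containing a space: unquoted $name would word-split (SC2086).
name="my file.txt"
# $(...) replaces legacy backticks (SC2006); every expansion stays quoted.
result="$(echo "${name}" | tr ' ' '_')"
echo "${result}"
```

With the quotes in place the space survives into `tr`, so the pipeline sees one argument instead of two.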

sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


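The SC2155 warning above is more than style: in `local x=$(cmd)`, the exit status seen afterwards is that of `local` (always 0), not of `cmd`, so failures are silently masked. A small bash sketch of the difference:

```shell
#!/usr/bin/env bash
masked() {
    local v=$(false)    # $? becomes the status of `local` (0), hiding `false`'s 1
    echo "$?"
}
unmasked() {
    local v
    v=$(false)          # now $? reflects the command substitution itself
    echo "$?"
}
masked
unmasked
```

Declaring and assigning separately, as the suggested fix does, keeps error handling (`set -e`, `|| exit`) working as expected.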
In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


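SC2292's preference for `[[ ]]` is also about robustness rather than taste: `[[ ]]` is a bash keyword that suppresses word splitting, so a variable holding spaces still tests correctly, whereas `[ ]` is an ordinary command that receives the split words as extra arguments. A bash-only illustration with hypothetical values:

```shell
#!/usr/bin/env bash
v="two words"
# [[ ]] does not word-split $v, so this test is safe even unquoted:
if [[ -n $v ]]; then r1="ok"; fi
# [ ] word-splits: `[ -n two words ]` fails with "too many arguments".
if [ -n $v ] 2>/dev/null; then r2="ok"; else r2="error"; fi
echo "$r1 $r2"
```

Quoting (`[ -n "$v" ]`) also fixes the `[ ]` form, but `[[ ]]` removes the failure mode entirely, which is why the checker asks for it in bash scripts.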
In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit
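
The `cd -` findings above all share one failure mode: if the directory change fails, every following command (including `rm -f`) runs in whatever directory the shell happened to be in. A minimal sketch of the guarded form, using a hypothetical scratch directory rather than the repo's real paths:

```shell
#!/usr/bin/env bash
# SC2164 sketch: guard every cd so later commands never run in the wrong CWD.
set -u

workdir="/tmp/sc2164-demo-$$"   # hypothetical scratch dir, not from the repo

mkdir -p "${workdir}"

# Unguarded (flagged):  cd "${workdir}"; rm -f test_data.tar.gz
# Guarded, as shellcheck suggests:
cd "${workdir}" || exit 1
touch test_data.tar.gz
rm -f test_data.tar.gz
cd - >/dev/null || exit 1
```

Inside a function, `|| return` is the matching guard; under `set -e` an unguarded `cd` already aborts, but the explicit guard documents the intent.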


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
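
SC2091 here is more than style: `$(nc -z ...)` substitutes nc's *output* back onto the command line and tries to execute it; the loop should test the command's exit status directly. A sketch with a stand-in for `nc`:

```shell
# SC2091 sketch: run the probe command itself instead of executing its output.
port_open() { true; }   # stand-in for `nc -z localhost 9083` in this demo

# Flagged:  if ! $(port_open); then ...
# Fixed:    test the command directly.
if ! port_open; then
  status="waiting"
else
  status="ready"
fi
echo "${status}"
```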


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
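
The backtick-plus-`ls | grep` pattern flagged for `HUDI_HIVE_UBER_JAR` can also be written as a glob loop, which survives filenames with spaces or other odd characters. A sketch with hypothetical jar names standing in for the real bundle directory:

```shell
# SC2006/SC2010 sketch: $(...) plus a glob loop instead of backticks and ls|grep.
dir=$(mktemp -d)                     # hypothetical jar directory for the demo
touch "${dir}/bundle-sources.jar" "${dir}/hoodie-hive-sync-bundle.jar"

uber_jar=""
for jar in "${dir}"/*.jar; do
  case "${jar}" in
    *source*) ;;                     # skip source jars, as `grep -v source` did
    *) uber_jar="${jar}"; break ;;   # first match, as `head -1` did
  esac
done
echo "${uber_jar}"
```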


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.
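
SC2125 fires because globs are *not* expanded in an assignment, so the classpath keeps its literal `*` characters. That happens to be exactly what `java -cp` wants (the JVM expands `dir/*` itself), but shellcheck cannot know that; an array joined explicitly makes the intent visible and quotes cleanly. A sketch with a hypothetical install prefix:

```shell
# SC2125 sketch: build the classpath from an array, keeping the `*` literal.
HADOOP_HOME="/opt/hadoop"            # hypothetical install prefix
cp_parts=(
  "${HADOOP_HOME}/share/hadoop/common/*"
  "${HADOOP_HOME}/share/hadoop/hdfs/*"
)
# Join with ':' in a subshell so IFS is not changed for the rest of the script.
classpath=$(IFS=:; printf '%s' "${cp_parts[*]}")
echo "${classpath}"
```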


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"
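
SC2145 is the one real error in this file: inside a double-quoted string, `"$@"` mixes a string with an array, and bash silently splices the arguments in an unintuitive way. For a log line, `$*` (all arguments joined into one word) is what was meant. A sketch with hypothetical HiveSyncTool arguments:

```shell
# SC2145 sketch: use $* to join all arguments into a single logged string.
set -- --db demo --table t1          # hypothetical HiveSyncTool arguments
msg="Running Command : java ... HiveSyncTool $*"
echo "${msg}"
```

Note the asymmetry with line 56 of the same script: when the arguments are *passed on* to another command, `"$@"` is correct; `$*` is only for flattening them into one string.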


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).
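
SC2020 points at a genuine trap: `tr -d "addr:"` deletes the *characters* `a`, `d`, `r`, and `:` anywhere in the input, not the word `addr:`. IP digits survive, which is why the pipeline appears to work, but any letters would be mangled. A prefix strip says what was meant:

```shell
# SC2020 sketch: strip the literal "addr:" prefix instead of deleting chars.
raw="addr:10.1.2.3"                  # hypothetical ifconfig field for the demo
ip="${raw#addr:}"                    # removes the prefix only, once
echo "${ip}"
```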


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
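
The SC2027 note is easy to misread: the inner `"` around `${container_id}` actually *closes* the outer double-quoted string, so the variable ends up unquoted. Since expansion already happens inside double quotes, one pair of quotes around the whole message is enough. A sketch with hypothetical values:

```shell
# SC2027 sketch: one outer pair of double quotes; ${...} expands inside it.
container_id="abc123"                # hypothetical values for the demo
ip_host="10.0.0.5"
topic="events"
msg="docker exec ${container_id} bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server ${ip_host}:19193 --topic ${topic}'"
echo "${msg}"
```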


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file
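
SC2024 flags a silent privilege bug: the `>>` redirect is performed by the *calling* shell before `sudo` ever runs, so root privileges never reach `/etc/hosts`. Piping through `sudo tee -a` does the append as root. The sketch below uses an unprivileged scratch file standing in for `/etc/hosts` (and therefore drops `sudo`) so it can run anywhere:

```shell
# SC2024 sketch: append via tee so the write happens under the elevated process.
# Flagged:  sudo echo "127.0.0.1 namenode" >> /etc/hosts
# Fixed:    echo "127.0.0.1 namenode" | sudo tee -a /etc/hosts >/dev/null
hosts_file=$(mktemp)                 # scratch stand-in for /etc/hosts
echo "127.0.0.1 namenode" | tee -a "${hosts_file}" >/dev/null
cat "${hosts_file}"
```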


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
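For example, option 2 can look like the following. This is a minimal, hypothetical snippet (not taken from the scripts in this PR) where word splitting is intentional, so the SC2086 finding is suppressed for just that one line with an inline directive:

```shell
#!/usr/bin/env bash
# Hypothetical example: $files is deliberately left unquoted so that
# word splitting produces one argument per filename.
files="a.txt b.txt"
# shellcheck disable=SC2086  # splitting is intentional here
printf '%s\n' $files
```

The directive only applies to the next command, so other occurrences of the same issue elsewhere in the script are still reported.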



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename


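Most of the shellcheck findings above repeat the same few patterns: legacy backticks for command substitution (SC2006), unquoted variable expansions (SC2086/SC2250), and `local` masking a command substitution's exit status (SC2155). A minimal sketch of the corrected idioms — the `escape_slashes` helper and its input are hypothetical, loosely modeled on `hive-configure.sh`:

```shell
#!/usr/bin/env bash
# SC2006: use $(...) instead of legacy backticks for command substitution.
year=$(date +%Y)

# SC2155: declare first, assign second, so `local` does not mask the
# exit status of the command substitution.
escape_slashes() {
    local entry="$1"
    local escaped
    # SC2086/SC2250: quote and brace the expansion to prevent word
    # splitting and globbing.
    escaped=$(echo "${entry}" | sed 's/\//\\\//g')
    echo "${escaped}"
}

escape_slashes "/user/hive/warehouse"   # prints \/user\/hive\/warehouse
```

Fixing the style-level findings this way also silences the warnings without changing behavior, so the shfmt reformat and the shellcheck fixes can land in the same pass.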
@mymeiyi mymeiyi force-pushed the add-mow-compact-key branch from 0c98f11 to 4b9875b on February 25, 2025 09:39

sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
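For options 1 and 2, a minimal sketch of what the recurring findings above (SC2006, SC2086, SC2250, SC2292, SC2242) ask for; the function and paths here are illustrative, not taken from the PR's scripts:

```shell
#!/usr/bin/env bash

# SC2006: use $(...) instead of legacy backticks.
host=$(hostname)
# A targeted suppression (option 2) when the warning is a false positive:
# shellcheck disable=SC2034
unused_but_exported_elsewhere="${host}"

check_dir() {
    local dir="$1"
    # SC2292: prefer [[ ]] over [ ] in Bash.
    # SC2086 + SC2250: quote the expansion and put braces around it,
    # so paths containing spaces survive word splitting.
    if [[ -d "${dir}" ]]; then
        echo "ok: ${dir}"
    else
        # SC2242: status codes must be 0-255, so never 'exit -1'.
        return 1
    fi
}

check_dir /tmp
```

Option 3 applies the same suppression repo-wide via `SHELLCHECK_OPTS`, which is usually only appropriate for pure style codes such as SC2250.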



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt -w filename



sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
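Beyond style, SC2006's `$(...)` form nests cleanly, while backticks need escaping for every inner level. A minimal demonstration (standalone sketch, not part of the report):

```shell
#!/usr/bin/env bash
# SC2006: $(...) command substitution nests without extra escaping.
val=$(echo "outer-$(echo inner)")
# The backtick equivalent would need escapes: `echo "outer-\`echo inner\`"`
echo "$val"
```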


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')
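SC2155 is a warning rather than style because `local var=$(cmd)` reports the exit status of `local` itself (always 0), hiding any failure of `cmd`. A minimal demonstration (standalone sketch, not part of the report):

```shell
#!/usr/bin/env bash
# SC2155: combining `local` with assignment masks the command's exit status.
masked() {
    local out=$(false)
    echo "status: $?"   # status of `local`, not `false` -- failure hidden
}
separate() {
    local out
    out=$(false)
    echo "status: $?"   # status of `false` -- failure visible
}
masked
separate
```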


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"
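The quoting SC2086 asks for here prevents each argument from being word-split (and glob-expanded) before `addProperty` sees it. A minimal demonstration (standalone sketch, not part of the report):

```shell
#!/usr/bin/env bash
# SC2086: an unquoted expansion is split on whitespace before the call.
entry="a  b   c"
count_args() { echo "$#"; }
count_args $entry     # unquoted: splits into 3 arguments
count_args "$entry"   # quoted: passed as 1 argument
```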


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit
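
The recurring SC2164 findings above share one failure mode: if `cd` fails (missing directory, bad permissions), the script keeps running in whatever directory it was already in. A minimal sketch of the guarded form, using a hypothetical scratch directory rather than the repo's real paths:

```shell
#!/usr/bin/env bash
# Guarded cd: stop instead of continuing in the wrong directory.
workdir=$(mktemp -d)          # hypothetical scratch dir for the demo
cd "${workdir}" || exit 1     # the pattern shellcheck suggests
echo "working in: $(pwd)"

# An unguarded `cd /missing` would print an error and fall through;
# the guard turns that into an explicit, handled failure:
cd /missing-dir-for-demo 2>/dev/null || echo "cd failed, handled"
```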


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
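
SC2091 flags `! $(nc -z …)`: command substitution pastes the command's stdout back into the line, and the shell then tries to *execute* that output as a command. Because `nc -z` prints nothing, the loop happens to behave, but the idiom breaks the moment the probed command prints anything. A sketch with a hypothetical `probe` function:

```shell
#!/usr/bin/env bash
probe() { echo "some output"; return 0; }

# Wrong shape: $(probe) substitutes "some output" and the shell then
# tries to run a command named "some" (-> not found, status 127).
if $(probe) 2>/dev/null; then echo "then-branch"; else echo "else-branch"; fi

# Right shape: test the exit status directly, no substitution, e.g.
#   while ! nc -z localhost "${HMS_PORT:-9083}"; do sleep 1; done
if probe >/dev/null; then echo "probe succeeded"; fi
```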


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.
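
SC2016 is informational: single quotes suppress expansion in the *outer* shell. In this `xargs … bash -ec '…'` pipeline that is plausibly intentional (the variables should expand inside the child bash, not before `xargs` runs), so the finding may be a false positive here. The underlying difference is easy to demonstrate:

```shell
#!/usr/bin/env bash
name="world"
echo 'hello $name'    # single quotes: literal, prints: hello $name
echo "hello $name"    # double quotes: expands, prints: hello world

# Single quotes can defer expansion to a child shell on purpose:
export name
bash -c 'echo "name in child: $name"'   # child expands its own $name
```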


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.
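
SC2125: globs (and brace expansions) are not expanded on the right-hand side of an assignment, so the variable holds a literal `*`. For a Java `-cp` value that is often the intent, since the JVM expands `dir/*` entries itself; the fix here may be to quote the pattern deliberately rather than to force shell expansion. A sketch of the behavior with a hypothetical jar directory:

```shell
#!/usr/bin/env bash
dir=$(mktemp -d)                  # hypothetical jar directory
touch "${dir}/a.jar" "${dir}/b.jar"

CP=${dir}/*        # assignment: the glob stays literal (SC2125)
echo "$CP"         # prints the literal pattern: <dir>/*
echo $CP           # unquoted use: the shell expands the glob here
```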


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"
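
SC2145 is the one hard error in this batch: inside a double-quoted string, `$@` expands to separate words and the surrounding text fuses onto the first and last of them, while `$*` joins all arguments into one word, which is what an echoed command line wants. A sketch with stand-in arguments:

```shell
#!/usr/bin/env bash
set -- --database demo --table t1    # stand-in for the sync-tool args

echo "cmd: java HiveSyncTool $*"     # one word: all args joined by spaces

# "cmd: ... $@" would instead produce multiple fields:
printf '[%s]\n' "cmd: $@"
# prints four lines: [cmd: --database] [demo] [--table] [t1]
```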


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).
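
SC2020 is worth a second look: `tr -d "addr:"` deletes the *set* of characters `a`, `d`, `r`, `:`, not the substring `addr:`. It happens to work on numeric IPs only because digits and dots are not in that set. Prefix removal wants `sed` (or shell parameter expansion):

```shell
#!/usr/bin/env bash
# tr -d deletes a SET of characters, not the string "addr:".
printf 'addr:10.0.0.1\n' | tr -d 'addr:'     # prints 10.0.0.1 (digits/dots untouched)
printf 'addr:bad\n'      | tr -d 'addr:'     # prints b  (every a, d, r stripped)

printf 'addr:bad\n'      | sed 's/^addr://'  # prints bad
```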


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.
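
On SC2242: exit statuses are 0-255, and bash reduces other values modulo 256, so `exit -1` reaches the parent as 255 while some stricter shells reject the negative literal outright. Writing `exit 1` states the intent portably. The wrap-around can be observed directly:

```shell
#!/usr/bin/env bash
# Exit statuses are reduced to 8 bits.
bash -c 'exit -1';  echo "status: $?"   # status: 255
bash -c 'exit 256'; echo "status: $?"   # status: 0  (wrapped mod 256)
```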


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file
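
SC2024 is a real trap: the `>>` redirect is opened by the unprivileged *calling* shell before `sudo` ever runs, so appending to a root-owned file fails; `| sudo tee -a` performs the write inside the privileged process. The mechanism can be sketched without sudo, using a temp file as a stand-in for `/etc/hosts`:

```shell
#!/usr/bin/env bash
f=$(mktemp)                                   # stand-in for /etc/hosts
# tee appends from inside the pipeline, not via a shell redirect:
echo "127.0.0.1 demo-host" | tee -a "$f" >/dev/null
cat "$f"                                      # prints: 127.0.0.1 demo-host

# Privileged form:  echo "..." | sudo tee -a /etc/hosts >/dev/null
```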


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
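A minimal sketch of what the three options look like in practice, using a hypothetical container-ID variable (the name is illustrative, not taken from the report):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical value for illustration only.
TRINO_CONTAINER_ID="doris--trino-1"

# Option 1: correct the script itself — quote expansions, brace variables,
# and prefer [[ ]] over [ ] (addresses SC2248, SC2250, SC2292).
status="running"
expected_status="running"
if [[ "${status}" == "${expected_status}" ]]; then
    echo "container ${TRINO_CONTAINER_ID} is ${status}"
fi

# Option 2: suppress one finding on one line where the flagged form is
# intentional, with a directive immediately above it.
# shellcheck disable=SC2086
echo ${TRINO_CONTAINER_ID}

# Option 3 is configured in CI rather than in the script: add the codes to
# SHELLCHECK_OPTS in the workflow file, e.g.
#   env:
#     SHELLCHECK_OPTS: "-e SC2248 -e SC2250"
```

Option 1 is preferable for actively maintained scripts; options 2 and 3 trade correctness checks for less churn, so they suit vendored or third-party scripts.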



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename


sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
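The suggested fix above can be tried in isolation; a minimal sketch of the `$(...)` form, with a fallback added here in case `hostname -f` is unavailable (the fallback is not part of the original script):

```shell
# SC2006: $(...) is easier to read and quote than legacy backticks.
unset CORE_CONF_fs_defaultFS  # ensure the default branch is exercised here
host=$(hostname -f 2>/dev/null || echo localhost)
fs=${CORE_CONF_fs_defaultFS:-hdfs://${host}:8020}
echo "${fs}"
```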


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit
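The SC2164 findings on the run.sh scripts all share one hazard: an unguarded `cd` that fails leaves the rest of the script running in the wrong directory. A minimal sketch of the suggested guard, using `|| return` inside a hypothetical function (`fetch_data` is illustrative, not from the scripts):

```shell
fetch_data() {
    local dir="$1"
    cd "${dir}" || return 1       # bail out if the directory is missing (SC2164)
    # ... download and unpack steps would go here ...
    cd - >/dev/null || return 1   # go back to the previous directory, guarded too
}

fetch_data /tmp && echo "ok"
fetch_data /no/such/dir 2>/dev/null || echo "guarded"
```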


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
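The SC2091 warning means `$(nc -z ...)` runs `nc` and then tries to *execute its output*; to branch on the exit status, the command should be called directly. A self-contained retry sketch in that style — `wait_until` is a hypothetical helper, with an arbitrary command standing in for `nc`:

```shell
wait_until() {
    local tries=0 max="$1"; shift
    while ! "$@"; do              # test the exit status directly (no $(...))
        tries=$((tries + 1))
        [[ ${tries} -lt ${max} ]] || return 1
        sleep 0.1
    done
}

wait_until 3 true && echo "service up"
wait_until 3 false || echo "gave up"
```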


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"
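The SC2125 and SC2145 findings on run_sync_tool.sh both stem from building a classpath as one flat string. One common alternative (a sketch under assumptions, not the script's actual layout) is to collect the glob matches in an array and join them, with a throwaway directory standing in for `${HIVE_HOME}/lib`:

```shell
libdir=$(mktemp -d)
touch "${libdir}/hive-exec-3.1.jar" "${libdir}/hive-jdbc-3.1.jar"

jars=("${libdir}"/*.jar)                       # glob expands at array assignment (avoids SC2125)
classpath=$(IFS=:; printf '%s' "${jars[*]}")   # join the entries with ':'
echo "java -cp ${classpath} org.apache.hudi.hive.HiveSyncTool $*"  # "$*" keeps args as one word for logging (avoids SC2145)
```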


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then
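SC2292 is stylistic, but `[[ ]]` also removes a class of quoting hazards: it performs no word splitting or glob expansion on its operands, and `==` supports pattern matching. A small comparison (variable contents are made up):

```shell
#!/usr/bin/env bash
arg="two words"

# [ ] is the POSIX test builtin; an unquoted $arg would be a syntax error here.
if [ "${arg}" = "two words" ]; then
    echo "posix test ok"
fi

# [[ ]] does not word-split ${arg}, and == takes a glob pattern.
if [[ ${arg} == two* ]]; then
    echo "bash test ok"
fi
```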


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).
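SC2020 flags a common misreading of `tr`: `tr -d "addr:"` deletes every occurrence of the characters `a`, `d`, `r`, and `:` from the stream, not the literal word `addr:`. IPv4 output survives by luck (digits and dots only); the intended prefix strip is spelled with `sed`:

```shell
#!/usr/bin/env bash
# tr -d deletes a *set* of characters (a, d, r, :), not the word "addr:".
echo "addr:cadre" | tr -d "addr:"    # prints "ce"

# To strip the literal "addr:" prefix from ifconfig output, use sed:
line="inet addr:192.168.1.10"
ip=$(echo "${line}" | awk '{print $2}' | sed 's/^addr://')
echo "${ip}"                          # prints 192.168.1.10
```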


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
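SC2027 here is more than style: the inner `"` characters close and reopen the outer string, so `${container_id}` ends up unquoted between two string pieces. Since `${...}` already expands inside double quotes, one continuous outer string is clearer (values below are placeholders):

```shell
#!/usr/bin/env bash
container_id="doris--kafka-1"   # placeholder values for the sketch
ip_host="10.0.0.5"
topic="basic_data"

# One outer double-quoted string; every expansion happens inside it.
echo "docker exec ${container_id} bash -c '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server ${ip_host}:19193 --topic ${topic}'"
```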


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file
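SC2024 points at a real privilege bug, not a style nit: in `sudo echo ... >>/etc/hosts` the redirection is opened by the unprivileged parent shell, so the append fails even though `echo` ran as root. Piping into `sudo tee -a` moves the file write into the privileged process (the demo below uses a scratch file instead of `/etc/hosts`):

```shell
#!/usr/bin/env bash
NAMENODE_CONTAINER_ID="namenode-1"   # placeholder value
hosts_copy=$(mktemp)                  # scratch stand-in for /etc/hosts

# Broken: the non-root parent shell opens the file, so root never writes it:
#   sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
# Fixed form suggested by shellcheck; tee itself runs under sudo:
#   echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" | sudo tee -a /etc/hosts >/dev/null

# Same tee pattern, demonstrated without root against the scratch file:
echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" | tee -a "${hosts_copy}" >/dev/null
grep "namenode-1" "${hosts_copy}"
```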


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.
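SC2035 guards against a filename such as `-n.tbl` being parsed as an option by `mv`. Prefixing the glob with `./` (or separating with `--`) makes every match an unambiguous operand:

```shell
#!/usr/bin/env bash
cd "$(mktemp -d)"
touch a.tbl b.tbl
mkdir data

# mv *.tbl data/     # a stray file named "-n.tbl" would become an option
mv ./*.tbl data/     # ./ forces each match to be treated as a path
ls data
```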


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
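SC2155 matters whenever `set -e` or `$?` checks are in play: `export VAR=$(cmd)` reports the exit status of `export` (always 0), silently swallowing a failure of `cmd`. Assigning first keeps the command's status visible:

```shell
#!/usr/bin/env bash
# Combined: $? reflects export itself (always 0), not the substituted command.
export BAD=$(false)
echo "combined status: $?"    # prints 0 even though false failed

# Separated: the assignment carries the command substitution's exit status.
GOOD=$(false) || echo "the command failed with status $?"
export GOOD
```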


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
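For option 2, the suppression is a comment directly above the flagged line (SC2086 shown as a stand-in code; the `cp` line is a fragment from the report, not runnable on its own):

```shell
# shellcheck disable=SC2086
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
```

For option 3, the equivalent repo-wide form is to put `-e SC2086` (space-separated for multiple codes) in the `SHELLCHECK_OPTS` value of the sh-checker step in the workflow `.yml`.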



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename


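The two fixes the checker asks for most often — `$(...)` instead of legacy backticks (SC2006) and quoted, braced expansions (SC2086/SC2250) — come down to how the shell splits unquoted words. A minimal sketch (the variable names here are illustrative, not from the scripts above):

```shell
#!/bin/sh
# SC2086 in practice: an unquoted expansion is split on whitespace,
# a quoted one is passed through as a single word.
value="a b c"

set -- $value                # word splitting: three positional arguments
unquoted=$#

set -- "$value"              # quoted: one positional argument
quoted=$#

# SC2006: $(...) replaces backticks and nests without escaping.
echo "$(printf '%s %s' "${unquoted}" "${quoted}")"   # prints "3 1"
```

Running `shfmt -w` only fixes layout; the quoting and substitution issues above still have to be fixed by hand, which is what the suggested "Did you mean" rewrites do.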
@mymeiyi mymeiyi force-pushed the add-mow-compact-key branch from 58ad73c to 8984bf0 Compare February 26, 2025 04:06

sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit
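
The same two-line `cd`/`cd -` pattern recurs across all of these `run.sh` findings (SC2164). A minimal sketch of the guarded form, using a throwaway `sub` directory rather than the real data paths:

```shell
# SC2164: guard every `cd` so a missing directory aborts the script
# instead of letting later commands run in the wrong place.
# `sub` is a scratch directory created just for this demo.
mkdir -p sub
cd sub || exit 1            # guarded cd: abort on failure
here=$(basename "$(pwd)")
cd - >/dev/null || exit 1   # `cd -` can fail too, so it gets the same guard
rmdir sub                   # clean up the demo directory
```

With `set -e` in effect a bare `cd` that fails would stop the script, but these scripts don't set it, so the explicit `|| exit` is the safer fix.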


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
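
The SC2091 warning above is about control flow, not quoting: `if`/`while` already test a command's exit status, so the command should run bare rather than inside `$(...)`, which would execute the command's *output* as a new command. A sketch, with `probe` standing in for the real `nc -z localhost "${HMS_PORT:-9083}"`:

```shell
# SC2091: run the command directly in the condition; its exit status
# is what `if`/`while` evaluate. Wrapping it in $(...) would instead
# try to execute whatever text the command prints.
probe() { return 0; }   # stand-in for `nc -z host port`; pretend it succeeds
if probe; then
  reachable=yes
else
  reachable=no
fi
```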


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}
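
The backtick findings above (SC2006) are stylistic but worth fixing for a practical reason: `$(...)` nests without escaping, while backticks need increasingly fragile backslashes. A minimal sketch:

```shell
# SC2006: $(...) is equivalent to backticks for simple cases...
inner=$(echo "b")
# ...but nests cleanly, which `echo "a `echo b` c"` would not without
# escaping the inner backticks.
outer=$(echo "a $(echo "b") c")
```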


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"
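
The SC2145 error on line 55 (the only error-level finding in this file) comes from putting `$@` inside a double-quoted string: `"$@"` expands to separate words, which can't be spliced into the middle of one string. Inside a string, `"$*"` is the intended form, joining the arguments with spaces. A sketch with hypothetical sync-tool arguments:

```shell
# SC2145: inside a double-quoted string use "$*" (arguments joined by
# the first character of IFS, a space by default), not "$@".
set -- --database test_db --table test_tbl   # hypothetical arguments
msg="Running Command : HiveSyncTool $*"
```

The actual invocation on line 56 is correct as written: there `"$@"` is passed as separate arguments to `java`, which is exactly what `"$@"` is for.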


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.
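
SC2086, the most frequent info-level finding in this report, flags a real hazard rather than pure style: an unquoted expansion is split on whitespace and glob-expanded before the command sees it. A sketch showing the splitting with a filename containing a space:

```shell
# SC2086: unquoted expansions undergo word splitting. A path with a
# space becomes two arguments; quoting keeps it as one.
f="my file.txt"
set -- ${f}       # unquoted: the shell splits on whitespace -> 2 words
unquoted_count=$#
set -- "${f}"     # quoted: one argument, as intended
quoted_count=$#
```

The paths in these scripts happen not to contain spaces today, which is why the unquoted forms work, but quoting makes that assumption explicit and safe.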


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file
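
The SC2024 warning above is easy to misread: the `>>` redirect is performed by the *calling* shell before `sudo` ever starts, so root privileges never apply to the file being appended to. The standard fix pipes into `sudo tee -a`. A sketch using a scratch file (`hosts.tmp`) in place of `/etc/hosts`, and plain `tee` since no privileges are needed for the demo:

```shell
# SC2024: `sudo cmd >> file` opens `file` as the unprivileged caller.
# Piping into `tee -a` moves the append inside the (sudo-able) command.
echo "127.0.0.1 namenode.example" | tee -a hosts.tmp >/dev/null
entry=$(tail -n 1 hosts.tmp)
rm -f hosts.tmp   # clean up the scratch file
```

In the real script the pipeline would be `echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" | sudo tee -a /etc/hosts >/dev/null`.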


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
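Options 1 and 2 can be sketched together in a short script. This is an illustrative example, not taken from the patched files: `escape_slashes`, `host`, and `newest` are hypothetical names, but the fixes mirror the recurring findings above (SC2006 backticks, SC2086/SC2250 quoting and bracing, SC2155 masked return values, and a deliberate `disable` directive).

```shell
#!/usr/bin/env bash
set -eu

# SC2006: prefer $(...) over legacy backticks.
host="$(hostname -f)"
# SC2086/SC2250: quote and brace every variable expansion.
echo "defaultFS would be: hdfs://${host}:8020"

# SC2155: declare first, then assign, so `local` does not mask the
# exit status of the command substitution (cf. escapedEntry above).
escape_slashes() {
    local escaped
    escaped="$(printf '%s' "$1" | sed 's/\//\\\//g')"
    printf '%s\n' "${escaped}"
}
escape_slashes 'a/b/c' # prints a\/b\/c

# Option 2: suppress a finding you deliberately accept, directly
# above the offending line.
# shellcheck disable=SC2012
newest="$(ls -t /tmp | head -n 1)"
echo "newest entry in /tmp: ${newest}"
```

Running the fixed script through `shellcheck` again should report none of the codes listed above for these lines.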



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename



sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
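Worth noting on SC2006: beyond style, the `$(...)` form nests without backslash-escaping, which backticks do not. A minimal sketch of the same pattern, using `echo` in place of the script's `hostname -f` call so it runs anywhere (the hostname value is made up):

```shell
# $(...) captures command output like backticks, but nests cleanly.
host=$(echo "namenode.example.com")      # stand-in for $(hostname -f)
fs_default="hdfs://${host}:8020"

# Nesting needs no escaping, unlike backticks-within-backticks:
outer=$(echo "inner=$(echo nested)")
```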


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')
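The SC2155 finding is more than cosmetic: `local var=$(cmd)` reports the exit status of `local` itself, so a failing `cmd` goes unnoticed by any later `$?` check or `set -e`. A small bash sketch (the function names are made up):

```shell
masked() {
    # 'local' succeeds even though the substituted command fails,
    # so the success branch is taken.
    if local out=$(false); then echo "masked: looks successful"; fi
}
unmasked() {
    local out
    # Declared separately, the assignment's status is that of 'false'.
    if out=$(false); then echo "ok"; else echo "unmasked: failure seen"; fi
}
masked_msg=$(masked)
unmasked_msg=$(unmasked)
```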


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"
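The many SC2086 items all reduce to the same hazard: an unquoted expansion is word-split (and glob-expanded), so a value containing a space becomes two arguments. A self-contained illustration with a hypothetical filename:

```shell
path="my file.xml"
set -- $path        # unquoted: split into two arguments
unquoted_argc=$#
set -- "$path"      # quoted: remains one argument
quoted_argc=$#
```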


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then
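On the SC2292 items: `[[ ]]` is a bash keyword, so expansions inside it are not word-split and an unquoted spaced value cannot break the test, while `[ ]` needs careful quoting. A bash sketch (the value is made up):

```shell
type_value="nested keyword"
bb_ok=no; sb_ok=no
if [[ -n $type_value ]]; then bb_ok=yes; fi   # safe unquoted inside [[ ]]
if [ -n "$type_value" ]; then sb_ok=yes; fi   # [ ] requires the quotes
```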


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit
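SC2164 exists because an unguarded `cd` that fails (deleted directory, bad permissions) lets the rest of the script run in whatever directory it happened to be in. A sketch with a hypothetical helper and a deliberately missing path:

```shell
load_data() {
    cd "$1" || return 1      # bail out rather than run in the wrong dir
    echo "loading from $(pwd)"
}
guard_status=0
load_data /no/such/dir 2>/dev/null || guard_status=$?
```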


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
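This SC2091 is a genuine bug, not style: `$(nc -z ...)` substitutes nc's output, and the shell then executes that output. `nc -z` prints nothing, so `! $(...)` negates an empty command and the wait loop falls through immediately whether or not the metastore port is open; dropping the `$()` (i.e. `while ! nc -z localhost "${HMS_PORT:-9083}"; do`) tests nc's exit status as intended. The hazard in miniature:

```shell
out=$(echo "echo hello")   # substitution yields the TEXT "echo hello"
result=$($out)             # ...which the shell then runs as a command
```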


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.
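Context for SC2125: globs are not expanded on the right-hand side of an assignment, so `HADOOP_HIVE_JARS` holds literal `*` characters that only expand later at unquoted use sites. (For a `java -cp` argument the literal `dir/*` entries may even be intentional, since the JVM expands classpath wildcards itself, which is why this is a warning rather than an error.) A POSIX sketch:

```shell
tmpdir=$(mktemp -d)
touch "$tmpdir/a.jar" "$tmpdir/b.jar"
pattern="$tmpdir"/*.jar     # stays literal in the assignment
set -- $pattern             # expands only here, at an unquoted use site
jar_count=$#
rm -r "$tmpdir"
```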


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"
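SC2145 fires because `$@` inside a string mixes an array expansion into a string; for a log line like this one, `$*`, which joins all arguments into a single word, is the usual intent. Sketch with a made-up helper:

```shell
log_cmd() {
    msg="Running Command : $*"   # $* joins the arguments into one string
    echo "$msg"
}
logged=$(log_cmd --db default --table t1)
```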


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file
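The SC2024 finding above is worth a closer look, since a plain `sudo echo ... >> file` silently writes with the caller's privileges. A minimal sketch of the fix pattern, using a temp file as a stand-in for `/etc/hosts` (the container id value here is hypothetical):

```shell
#!/usr/bin/env bash
# SC2024: the '>>' redirect is performed by the invoking shell, so sudo on
# the echo does not make the write privileged. Pipe into 'tee -a' instead
# (run via 'sudo tee -a' in the real script) so the appending process itself
# holds the elevated privileges.
hosts_file="$(mktemp)"                       # stand-in for /etc/hosts
NAMENODE_CONTAINER_ID="doris--namenode"      # hypothetical value for illustration
echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" | tee -a "${hosts_file}" >/dev/null
```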


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
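The three options above can be sketched in one place. Option 1 fixes the code (here, the recurring SC2086 quoting issue), option 2 suppresses a single finding with an inline directive, and option 3 is noted as a comment since it lives in the workflow file, not the script; the variable names are illustrative only:

```shell
#!/usr/bin/env bash
# Option 1: correct the script itself — quote the expansion so SC2086
# (word splitting / globbing) no longer fires.
components="hive kafka"
echo "${components}"

# Option 2: silence one specific finding on the next line only.
# shellcheck disable=SC2034
UNUSED_FLAG=1   # intentionally unused; SC2034 suppressed just here

# Option 3 (workflow-wide): in the .yml action file, set e.g.
#   SHELLCHECK_OPTS: "-e SC2034 -e SC2250"
# so those codes are ignored for every checked script.
```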



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt -w filename


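For reference, the recurring fixes that shfmt and ShellCheck request above all follow the same few patterns. A minimal sketch (the variable names and the sample property string are illustrative, not taken from the repo scripts):

```shell
#!/usr/bin/env bash
set -euo pipefail

# SC2006: use $(...) instead of legacy backticks for command substitution
CUR_DIR="$(cd "$(dirname "$0")" && pwd)"

# SC2086/SC2250: quote every expansion and wrap it in braces
echo "working in ${CUR_DIR}"

# SC2164: guard cd so the script stops instead of running in the wrong directory
cd "${CUR_DIR}" || exit 1

# SC2155: assign separately from 'local'/'declare' so a failing command
# substitution is not masked by the declaration's exit status
entry="<property><name>k</name><value>v</value></property>"
escapedEntry="$(printf '%s' "${entry}" | sed 's/\//\\\//g')"
echo "${escapedEntry}"
```

Running `shfmt -w` on a file only fixes indentation and redirection spacing; the quoting and substitution issues above still have to be edited by hand.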

sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
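The disable directive in option 2 goes on the line directly above the finding. A minimal sketch (using a throwaway variable, not taken from the scripts above) of both the quoting fix shellcheck suggests for SC2086 and the suppression route:

```shell
#!/bin/sh
# A value containing a space: unquoted expansion would word-split it,
# which is exactly what SC2086 warns about.
file="my file.txt"

# The fix shellcheck proposes: quote the expansion.
printf '%s\n' "${file}"

# If the warning is a deliberate false positive, suppress just that
# check on just the next line instead of fixing it:
# shellcheck disable=SC2086
echo ${file}
```

Option 3 applies the same suppression repo-wide via the action's `SHELLCHECK_OPTS` (e.g. `-e SC2086`), which silences the check in every script rather than at a single site.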



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename


Copy link

sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')
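The SC2155 warning is easy to underestimate: `local var=$(cmd)` always reports success because `$?` reflects `local`, not `cmd`. A minimal demonstration of the masking:

```shell
#!/usr/bin/env bash
# SC2155: combining `local` with command substitution hides the
# command's exit status, because `local` itself sets $? to 0.

masked() {
    local out=$(false)   # the failure of `false` is lost here
    echo "$?"            # prints 0 -- status of `local`, not `false`
}

unmasked() {
    local out
    out=$(false)         # declared separately: $? now reflects `false`
    echo "$?"            # prints 1
}

masked_status=$(masked)
unmasked_status=$(unmasked)
echo "masked=${masked_status} unmasked=${unmasked_status}"
```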


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then
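The SC2292 suggestions throughout this report prefer `[[ ]]` because it is bash syntax rather than an external test command, so an empty or unset variable cannot collapse into a missing operand. A small sketch (quoting remains good practice even though `[[ ]]` tolerates its absence):

```shell
#!/usr/bin/env bash
# With [ ], an unquoted empty variable can break the test expression;
# [[ ]] parses the expression before expansion, so this stays safe.
type_value=""

if [[ -n ${type_value} ]]; then
    result="non-empty"
else
    result="empty"
fi
echo "${result}"
```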


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit
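The reason SC2164 matters in these `run.sh` scripts: if a `cd` fails silently, everything after it runs in the wrong directory. A hedged sketch of the guarded pattern, using a temp directory instead of the real data paths:

```shell
#!/usr/bin/env bash
# Guard every cd with || return (in a function) or || exit (at top level),
# including `cd -`, which can fail if OLDPWD is unset or inaccessible.
workdir=$(mktemp -d)

fetch_data() {
    cd "${workdir}" || return 1   # stop here rather than touch files elsewhere
    touch data.tar.gz
    cd - >/dev/null || return 1   # `cd -` echoes the directory; silence it
}

fetch_data && status="ok" || status="failed"
echo "${status}"
rm -rf "${workdir}"
```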


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).
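SC2091 is subtle: `while ! $(nc -z ...)` runs `nc`, then tries to *execute its stdout* as a second command; the loop only works by accident because `nc -z` prints nothing. The fix is to test the exit status directly. A self-contained sketch using a stand-in probe instead of `nc`:

```shell
#!/usr/bin/env bash
# Test the command's exit status directly -- no $() wrapper.
probe() { false; }   # stand-in for `nc -z localhost 9083` while the port is closed

attempts=0
until probe || [[ ${attempts} -ge 3 ]]; do
    attempts=$((attempts + 1))   # real code would `sleep` between retries
done
echo "gave up after ${attempts} attempts"
```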


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.
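SC2125 means the `share/hadoop/common/*` globs above are stored as literal text, not expanded file lists (the JVM happens to expand `dir/*` classpath entries itself, which is why this works at all). When you do want shell-side expansion, an array does it at assignment time. A sketch with temp files standing in for the jars:

```shell
#!/usr/bin/env bash
# Globs are literal in plain assignments but expand inside array literals.
libdir=$(mktemp -d)
touch "${libdir}/a.jar" "${libdir}/b.jar"

plain="${libdir}"/*.jar     # stays the literal string ".../*.jar"
jars=("${libdir}"/*.jar)    # expands to the two files

# Join the array with ':' to build a classpath string.
classpath=$(IFS=:; echo "${jars[*]}")
echo "${classpath}"
rm -rf "${libdir}"
```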


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"
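SC2145 is the one genuine error here: `"$@"` expands to separate words, so embedding it in a double-quoted string mixes an array into a string. Inside a string, `$*` (arguments joined with spaces) is what a log line wants; `"$@"` is reserved for passing arguments through verbatim, as the `java` invocation on the next line of the script correctly does. A minimal sketch:

```shell
#!/usr/bin/env bash
# $* joins all arguments with spaces -- right for building a log message.
log_command() {
    echo "Running Command : java -cp demo.jar MainClass $*"
}

line=$(log_command --db default --table t1)
echo "${line}"
```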


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).
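SC2020 flags a real trap: `tr -d "addr:"` deletes the *characters* `a`, `d`, `r`, and `:` wherever they occur, not the word `addr:`. It happens to work for purely numeric IPs but corrupts anything containing those letters. Prefix removal with `${var#...}` is the safe alternative. A sketch with a hostname-like value that exposes the difference:

```shell
#!/usr/bin/env bash
# tr -d operates on a character set; ${var#prefix} strips only the prefix.
raw="addr:darknode.local"

broken=$(echo "${raw}" | tr -d "addr:")   # also eats a/d/r inside the value
fixed="${raw#addr:}"                      # removes only the leading "addr:"

echo "broken=${broken} fixed=${fixed}"
```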


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
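The SC2027 finding is about quote nesting: inside a double-quoted string, a `"` closes the string and the next one reopens it, so `"docker exec "${container_id}" ..."` leaves `${container_id}` unquoted between two string fragments. Since the whole thing is already one quoted string, the inner quotes can simply be dropped. A sketch using a value with a space to make the behavior visible:

```shell
#!/usr/bin/env bash
# One pair of outer quotes is enough; variable expansion inside a
# double-quoted string is already safe from word splitting.
container_id="abc 123"   # hypothetical id with a space

msg="docker exec ${container_id} bash -c 'echo hi'"
echo "${msg}"
```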


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.
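SC2242 is an error because exit statuses are limited to 0-255: `exit -1` is invalid in POSIX sh, and bash silently wraps it to 255. The conventional fix is an explicit small failure code, with any diagnostic sent to stderr. A sketch of the empty-IP check above under those conventions:

```shell
#!/usr/bin/env bash
# Return/exit with an explicit code in 0-255; print errors to stderr.
check_host() {
    local ip="$1"
    if [[ -z "${ip}" ]]; then
        echo "host ip is empty" >&2
        return 1          # portable failure code instead of -1
    fi
    echo "host ip is ${ip}"
}

check_host "" && status=0 || status=$?
echo "status=${status}"
```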


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
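As a concrete illustration of options 1 and 2, the sketch below uses a hypothetical `resolve_default_fs` helper (modelled on the `CORE_CONF_fs_defaultFS` line flagged in hive-configure.sh, not code from this PR). It applies two of the in-place fixes the report suggests — `$(...)` instead of backticks (SC2006) and declaring/assigning a `local` separately (SC2155) — and shows the inline `# shellcheck disable=NNNN` comment for a finding you choose to keep:

```shell
#!/usr/bin/env bash
# Hypothetical helper, modelled on the flagged CORE_CONF_fs_defaultFS default.

# Option 1: fix in place — use $(...) rather than backticks (SC2006), and
# declare/assign separately so `local` does not mask the command's exit
# status (SC2155).
resolve_default_fs() {
    local fs
    fs="hdfs://$(hostname):8020"
    echo "${fs}"
}

# Option 2: silence one specific finding where the original style is
# intentional, by placing the directive on the line above it.
# shellcheck disable=SC2292  # POSIX [ ] kept deliberately for sh compatibility
if [ -n "$(resolve_default_fs)" ]; then
    echo "default FS: $(resolve_default_fs)"
fi
```

Option 3 (adding `-e NNNN` to `SHELLCHECK_OPTS` in the workflow file) suppresses the code repository-wide, so prefer the inline directive when only a single occurrence is intentional.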



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt -w filename
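
The recurring shellcheck findings above boil down to two mechanical rewrites: replace legacy backticks with `$(...)` (SC2006) and quote variable expansions, preferably with braces (SC2086/SC2250). A minimal before/after sketch, using an illustrative variable name rather than one from the patched scripts:

```shell
#!/usr/bin/env bash
# Before (flagged by shellcheck):
#   HOST_FS=hdfs://`hostname -f`:8020   # SC2006: legacy backticks
#   echo $HOST_FS                       # SC2086: unquoted expansion

# After: modern command substitution, quoted and braced expansion.
HOST_FS="hdfs://$(hostname -f):8020"
echo "${HOST_FS}"
```

The quoted form behaves identically for well-formed hostnames but stays correct if the expansion ever contains spaces or glob characters.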


@mymeiyi mymeiyi force-pushed the add-mow-compact-key branch from e38525d to 282de53 on February 27, 2025 at 12:59

sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
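
As a minimal sketch of options 1 and 2 above (the variable name `files` here is illustrative, not from the flagged scripts): option 1 fixes the finding by quoting the expansion, while option 2 keeps the code as-is and suppresses the single finding with a `# shellcheck disable=` directive on the line above it.

```shell
#!/usr/bin/env bash

files="a b"

# Option 1: correct the issue — quote the expansion so SC2086 no longer fires.
echo "${files}"

# Option 2: suppress one specific finding with a directive comment.
# shellcheck disable=SC2086
echo $files
```

Option 3 (adding `-e NNNN` to `SHELLCHECK_OPTS`) silences the code across the whole workflow, so prefer it only for style-level codes you have decided not to enforce anywhere.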



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt -w filename


@mymeiyi mymeiyi force-pushed the add-mow-compact-key branch from 282de53 to 647eef0 on March 3, 2025 06:42

github-actions bot commented Mar 3, 2025

sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')
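The SC2155 warning above concerns exit-status masking: `local var=$(cmd)` always reports the status of `local` itself, not of `cmd`. A minimal sketch of the difference (the function names are illustrative, not from the script being checked):

```shell
#!/usr/bin/env bash
f_bad() {
    # Declaration and assignment in one step: shellcheck flags this (SC2155)
    # because `local` succeeds regardless of the command substitution.
    local escaped=$(false)
    echo "$?"  # prints 0: the failure of `false` is masked
}

f_good() {
    # Declaring first, then assigning, preserves the command's status.
    local escaped
    escaped=$(false)
    echo "$?"  # prints 1: the failure of `false` is visible
}

f_bad
f_good
```

This matters whenever the caller checks `$?` or runs under `set -e`, since the masked failure would otherwise go unnoticed.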


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"
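Most of the hive-configure.sh findings reduce to two patterns: legacy backticks (SC2006) and unquoted/unbraced expansions (SC2086/SC2250). A minimal sketch of why the quoting fix matters (the helper and values here are illustrative, not from the script):

```shell
#!/usr/bin/env bash
set -euo pipefail

count_args() { echo "$#"; }

v="a b c"
# Unquoted (SC2086): the shell word-splits the value into three arguments.
count_args ${v}      # prints 3
# Quoted with braces, as the suggestions show: one argument, verbatim.
count_args "${v}"    # prints 1

# SC2006: prefer $(...) over backticks; it nests and is easier to read.
host=$(hostname -f)
echo "fs.defaultFS=hdfs://${host}:8020"
```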


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"
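For es_init.sh, the fixes are the `#` comment (SC1127) plus the same brace-and-quote treatment inside the generated JSON action lines. A self-contained sketch of the corrected pattern (function name and sample values are illustrative):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Emit one _bulk action line; quote every expansion so index names or ids
# containing spaces cannot split or glob (SC2086/SC2250).
emit_action_line() {
    local index_name="$1" type_value="$2" id="$3"
    if [[ -n "${type_value}" ]]; then
        echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id}\"}}"
    else
        echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id}\"}}"
    fi
}

emit_action_line "composite_type_array" "doc" "item_1"
emit_action_line "composite_type_array" "" "item_2"
```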


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit
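The repeated SC2164 hits in the run.sh scripts are all the same shape: if `cd` fails, the rest of the script runs in the wrong directory. Either guard each `cd` as suggested, or scope the change in a subshell so no `cd -` is needed at all. A runnable sketch (temp directory stands in for `${CUR_DIR}`):

```shell
#!/usr/bin/env bash
set -euo pipefail

work_dir=$(mktemp -d)

# Guarded form, as the suggestions show:
cd "${work_dir}" || exit 1
touch data.tar.gz
cd - >/dev/null || exit 1

# Subshell form: the directory change dies with the subshell,
# so there is no cd - to forget or to have fail.
( cd "${work_dir}" && rm -f data.tar.gz )
echo "still in $(pwd)"
```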


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.
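The SC2091 finding in hive-metastore.sh is the one real bug in this batch: `while ! $(nc -z …)` runs nc, then tries to execute nc's *output* as a command. The fix is to test the command's exit status directly. A sketch with a stub in place of `nc -z localhost "${HMS_PORT:-9083}"` so it runs anywhere:

```shell
#!/usr/bin/env bash
set -euo pipefail

# check() stands in for `nc -z localhost "${HMS_PORT:-9083}"`.
check() { false; }   # pretend the port is not open yet

tries=0
while ! check; do    # no $(...): test the exit status, not the output
    tries=$((tries + 1))
    if [[ ${tries} -ge 3 ]]; then
        check() { true; }   # pretend the service came up
    fi
done
echo "ready after ${tries} retries"
```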


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"
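The run_sync_tool.sh findings cluster around parsing `ls` output (SC2010/SC2012) and embedding `$@` in a string (SC2145). Globbing into an array sidesteps both the `ls | grep | tr` pipeline and the quoting problems; `"$*"` is the right form inside a log string. A sketch under made-up jar names:

```shell
#!/usr/bin/env bash
set -euo pipefail

lib=$(mktemp -d)
touch "${lib}/hive-exec-3.1.2.jar" "${lib}/hive-jdbc-3.1.2.jar" \
      "${lib}/hive-jdbc-3.1.2-handler.jar"

# Glob into an array instead of `ls ... | grep -v handler | tr '\n' ':'`.
jars=()
for j in "${lib}"/hive-*.jar; do
    [[ "${j}" == *handler* ]] && continue
    jars+=("${j}")
done

# Join with ':' without ever parsing ls output.
classpath=$(IFS=:; echo "${jars[*]}")
echo "classpath entries: ${#jars[@]}"

# SC2145: inside a string, use $* (one word), not $@.
log_cmd() { echo "Running Command : java -cp ${classpath} HiveSyncTool $*"; }
log_cmd --database default --table t1
```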


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.
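SC2154 in the setup_demo_container scripts says `SPARK_CONF_DIR` is referenced but never assigned in the file. If it is meant to arrive via the environment, making that explicit with `:-` (fallback) or `:?` (fail fast) silences the warning and documents the contract. A sketch with an illustrative fallback path:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Fallback form: use the environment value if set, else a default.
conf_dir="${SPARK_CONF_DIR:-/etc/spark/conf}"
echo "copying spark-defaults.conf to ${conf_dir}/"

# Fail-fast alternative (commented out so the sketch runs unconfigured):
# : "${SPARK_CONF_DIR:?SPARK_CONF_DIR must be set}"
```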


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then
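The SC2292 findings across the kerberos scripts all prefer `[[ ]]` over `[ ]`: `[[` is a bash keyword, so empty unquoted expansions cannot break the test the way they can with the `[` command. A small demonstration:

```shell
#!/usr/bin/env bash
set -euo pipefail

FAILED=""
if [[ "${FAILED}" == "" ]]; then
    echo "all supervisord programs healthy"
fi

# With [ ], an empty unquoted variable leaves too few arguments and the
# test errors out; [[ ]] would have handled the same expression fine.
v=""
! [ ${v} == "" ] 2>/dev/null && echo "[ ] choked on the empty expansion"
```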


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
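The SC2027 hit is subtle: inside a double-quoted string, a bare `"` *ends* the string, so `"docker exec "${container_id}" bash …"` leaves `${container_id}` unquoted in the middle. Since double quotes already allow expansion, the fix is simply one pair of quotes around the whole message. Sketch with illustrative values:

```shell
#!/usr/bin/env bash
set -euo pipefail

container_id="abc 123"   # space on purpose: splitting would corrupt the message
topic="events"

# One quote pair; expansions still happen, nothing gets unquoted mid-string.
msg="docker exec ${container_id} kafka-topics.sh --create --topic ${topic}"
echo "${msg}"
```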


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
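As a concrete sketch of options 1 and 2, here is the `Processing $file` finding (SC2250, reported above for run-thirdparties-docker.sh line 412) handled both ways; the `file` value is hypothetical, chosen only for illustration:

```shell
#!/usr/bin/env bash
file="example.txt" # hypothetical value, for illustration only

# Option 1: fix the finding itself -- braces around the reference (SC2250):
echo "Processing ${file}"

# Option 2: keep the original spelling and suppress just this one finding:
# shellcheck disable=SC2250
echo "Processing $file"
```

Option 3 leaves the scripts untouched: adding `-e SC2250` to the `SHELLCHECK_OPTS` setting of the workflow that runs this check silences that code for the whole job (where exactly that setting lives depends on how the action is configured in this repository).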



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename
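Beyond the shfmt reformat, the most frequent shellcheck findings above are SC2006 (legacy backticks) and SC2086 (unquoted expansions). A self-contained illustration of why both fixes matter, using toy values rather than the project's scripts:

```shell
#!/usr/bin/env bash
# SC2006: $(...) replaces backticks; it nests cleanly and reads unambiguously.
greeting=$(echo hello)
echo "${greeting}"

# SC2086: an unquoted expansion undergoes word splitting, a quoted one does not.
val="a  b"
printf '%s\n' $val | wc -l     # splits into two words: prints 2
printf '%s\n' "$val" | wc -l   # stays a single word: prints 1
```

Running the quoted form everywhere (as the suggested fixes do) keeps arguments with spaces or glob characters intact when they reach commands like `sed` and `curl`.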



github-actions bot commented Mar 3, 2025

sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 327:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 336:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 341:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 360:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 362:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 364:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 372:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 373:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 377:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 388:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 390:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 400:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 401:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 403:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 411:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 412:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 416:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 427:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 429:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 496:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 497:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 498:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 507:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file


In docker/thirdparties/run-thirdparties-docker.sh line 509:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 511:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 515:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 530:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 531:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 532:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 543:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 544:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 549:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 554:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 560:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 598:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 600:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 612:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 617:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 618:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 619:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 758:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 759:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 760:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 762:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
  https://www.shellcheck.net/wiki/SC2242 -- Can only exit with status 0-255. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
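As a minimal sketch of option 2, here is what an inline suppression looks like, using the unquoted `${COMPONENTS}` echo flagged at run-thirdparties-docker.sh line 117 (the variable's value below is made up for illustration; the directive applies only to the line immediately after it):

```shell
#!/usr/bin/env bash
# Hypothetical example value; in the real script COMPONENTS is built from CLI args.
COMPONENTS="hive kafka"

# Option 2: suppress SC2086 for exactly the next line instead of quoting.
# shellcheck disable=SC2086
echo ${COMPONENTS}

# The quoted form most findings above suggest needs no directive at all:
echo "${COMPONENTS}"
```

For option 3, the equivalent global suppression in the workflow would be something like `SHELLCHECK_OPTS: -e SC2086` in the sh-checker step's `env:` block (the exact key depends on how the action is configured in your `.yml` file).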



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -186,7 +186,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -384,7 +384,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -423,7 +423,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -449,12 +449,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -616,9 +616,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -656,97 +656,97 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename


@mymeiyi mymeiyi force-pushed the add-mow-compact-key branch from 3c00373 to 66023c9 Compare March 5, 2025 10:43
Copy link
github-actions bot commented Mar 5, 2025

sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In bin/flight_record_fe.sh line 47:
FE_PID=$(${JAVA_HOME}/bin/jps | grep DorisFE | awk '{print $1}')
         ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
FE_PID=$("${JAVA_HOME}"/bin/jps | grep DorisFE | awk '{print $1}')


In bin/profile_fe.sh line 47:
FE_PID=$(${JAVA_HOME}/bin/jps | grep DorisFE | awk '{print $1}')
         ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
FE_PID=$("${JAVA_HOME}"/bin/jps | grep DorisFE | awk '{print $1}')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
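
As a side note, the SC2006 fix above is purely syntactic: both forms run a command substitution, but `$(...)` nests without escaping. A minimal sketch (toy values, not from the repo):

```shell
# Backticks and $(...) produce the same result, but $(...) nests cleanly;
# nesting backticks would require \` escapes.
legacy=`echo hi`
modern=$(echo hi)
nested=$(dirname "$(pwd)")
[ "$legacy" = "$modern" ] && echo "identical output"
```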


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')
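
The SC2155 warning above is more than style: combining `local` with an assignment discards the command's exit status, because `$?` reflects `local` itself. A small demo (function names are illustrative):

```shell
# `local v=$(cmd)` masks cmd's failure; declaring first preserves it.
split_assign() {
    local out
    out=$(false)        # $? now reflects `false`
    echo "status:$?"
}
combined_assign() {
    local out=$(false)  # `local` returns 0, hiding the failure
    echo "status:$?"
}
split_assign     # prints status:1
combined_assign  # prints status:0
```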


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"
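The SC2086/SC2250 findings above all amount to the same pattern. A minimal runnable sketch of the fully quoted, braced loop style shellcheck suggests (file names and the id prefix are stand-ins, not the script's real paths):

```shell
#!/usr/bin/env bash
# Sketch of the quoted-and-braced style from the SC2086/SC2250 suggestions.
set -euo pipefail

data_file=$(mktemp)
output_file=$(mktemp)
printf 'first\nsecond\n' > "${data_file}"

id=0
while IFS= read -r line; do
    id=$((id + 1))
    # Braces plus double quotes keep empty or space-containing values intact.
    echo "{\"index\": {\"_id\": \"item_${id}\"}}" >> "${output_file}"
    echo "${line}" >> "${output_file}"
done < "${data_file}"

result=$(cat "${output_file}")
echo "${result}"
rm -f "${data_file}" "${output_file}"
```

Braces are purely stylistic here (SC2250 is style-level), but the double quotes are load-bearing: without them a value containing spaces or globs would be split before `echo` sees it.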


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit
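The SC2164 findings in the `run.sh` scripts below follow from the same rule. A small sketch of the guard shellcheck asks for; `workdir` is a throwaway temporary directory, not the scripts' real `CUR_DIR`:

```shell
# Sketch of the `cd ... || exit` guard from SC2164.
workdir=$(mktemp -d)
startdir=$(pwd)

# Without the guard, a failed cd leaves later commands running in the
# wrong directory; with it, the script stops instead.
cd "${workdir}" || exit 1
touch test_data.tar.gz
rm -f test_data.tar.gz

# `cd -` jumps back to $OLDPWD (and prints it); it needs the same guard.
cd - >/dev/null || exit 1
enddir=$(pwd)
```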


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.
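A quick sketch of what SC2016 is flagging: single quotes keep `$` literal, double quotes expand it. In `hive-metastore.sh` the single quotes are likely intentional (the body is evaluated later by `bash -ec`), which is consistent with shellcheck reporting this only at info level:

```shell
# Single vs. double quotes around $-expansions (SC2016).
name="world"
single_quoted='hello $name'    # $name stays literal
double_quoted="hello ${name}"  # $name expands now
echo "${single_quoted} / ${double_quoted}"
```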


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
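The `run_sync_tool.sh` findings combine two fixes: `$(...)` instead of legacy backticks (SC2006), and a glob with a condition instead of `ls | grep` (SC2010/SC2012). A hedged sketch under a temporary mock directory standing in for the real jar path:

```shell
# Sketch of the SC2006 + SC2010 fixes using a mock jar directory.
jar_dir=$(mktemp -d)
touch "${jar_dir}/hoodie-hive-sync-bundle.jar" \
      "${jar_dir}/hoodie-hive-sync-bundle-sources.jar"

# $(...) nests cleanly and is easier to read than backticks.
file_count=$(find "${jar_dir}" -name '*.jar' | wc -l)

# A glob plus a condition replaces `ls | grep -v source` and survives
# file names with spaces or other odd characters.
uber_jar=""
for jar in "${jar_dir}"/*.jar; do
    if [[ "${jar}" == *source* ]]; then
        continue
    fi
    uber_jar="${jar}"
done
echo "${uber_jar}"
```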


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.
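SC2125 here means the `*` in `HADOOP_HIVE_JARS=...` is stored literally, not expanded (which may be fine if `java -cp` receives it quoted, since the JVM expands classpath wildcards itself). A sketch of the array alternative shellcheck hints at, using a temporary mock for `HADOOP_HOME`:

```shell
# Glob in a plain assignment vs. an array assignment (SC2125).
hadoop_home=$(mktemp -d)
mkdir -p "${hadoop_home}/share/hadoop/common"
touch "${hadoop_home}/share/hadoop/common/a.jar" \
      "${hadoop_home}/share/hadoop/common/b.jar"

# Plain assignment: the * is kept literally (what SC2125 warns about).
literal="${hadoop_home}/share/hadoop/common/*"

# Array assignment: the glob expands to the matching files.
jars=("${hadoop_home}"/share/hadoop/common/*.jar)

# Join with ':' when a classpath string is actually needed.
classpath=$(IFS=':'; echo "${jars[*]}")
echo "${classpath}"
```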


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"
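SC2145 is the one genuine error in this file: `"$@"` inside a larger double-quoted string mixes a string with an array expansion. The fix is `"$*"` for display and `"$@"` for forwarding. A sketch with made-up function names:

```shell
# "$*" for messages, "$@" for argument forwarding (SC2145).
print_command() {
    # "$*" joins all arguments into one word, safe inside a message string.
    echo "Running Command : java -cp ... HiveSyncTool $*"
}

count_args() {
    # "$@" keeps each argument a separate word, so $# stays accurate.
    inner_count() { echo "$#"; }
    inner_count "$@"
}

msg=$(print_command --database demo --table t1)
argc=$(count_args "a b" c)
echo "${msg} / ${argc}"
```

With `"$@"` in the `echo` string, bash would instead print the first argument fused into the message and emit the rest as separate words.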


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/docker-compose/ranger/ranger-admin/ranger-entrypoint.sh line 24:
cd $RANGER_HOME
   ^----------^ SC2154 (warning): RANGER_HOME is referenced but not assigned.
   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.
   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cd "${RANGER_HOME}"


In docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh line 16:
#!/bin/bash
^-- SC1128 (error): The shebang must be on the first line. Delete blanks and move comments.


In docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh line 19:
if [ ! -d "${RANGER_HOME}/ews/webapp/WEB-INF/classes/ranger-plugins/doris" ]; then
   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
           ^------------^ SC2154 (warning): RANGER_HOME is referenced but not assigned.

Did you mean: 
if [[ ! -d "${RANGER_HOME}/ews/webapp/WEB-INF/classes/ranger-plugins/doris" ]]; then


In docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh line 15:
#!/bin/bash
^-- SC1128 (error): The shebang must be on the first line. Delete blanks and move comments.


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 330:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 339:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 344:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 363:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 365:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 367:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.
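SC2242 flags `exit -1` because exit statuses must be 0-255 (bash maps out-of-range values mod 256, so `-1` becomes 255 at best). A sketch that also uses the `[[ ]]` style SC2292 prefers; `return 1` stands in for `exit 1` so it can run inside a function:

```shell
# Valid exit statuses and the [[ ]] test style (SC2242 + SC2292).
check_ip() {
    local ip_host="$1"
    if [[ "_${ip_host}" == "_" ]]; then
        echo "set eth0 network interface failed" >&2
        return 1    # a script would use `exit 1`, never `exit -1`
    fi
    echo "${ip_host}"
}

good=$(check_ip 10.0.0.5)
bad_status=0
check_ip "" 2>/dev/null || bad_status=$?
echo "${good} ${bad_status}"
```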


In docker/thirdparties/run-thirdparties-docker.sh line 375:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 376:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 380:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 391:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 393:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 403:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 404:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 406:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 414:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 415:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 419:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 430:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 432:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 499:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 500:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 501:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 510:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file


In docker/thirdparties/run-thirdparties-docker.sh line 512:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 514:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 518:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 533:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 534:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 535:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 546:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 547:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 552:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 557:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 563:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 601:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 603:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 615:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 622:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 623:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 624:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 779:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 780:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 781:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 783:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC1128 -- The shebang must be on the first ...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
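For findings like SC2024 above (where `sudo` does not affect the redirect), options 1 and 2 look roughly like the sketch below. This is a minimal illustration, not part of the report; the file path and hosts entry are hypothetical stand-ins for the real script's values.

```shell
#!/usr/bin/env bash
# Option 1: fix SC2024 manually by piping through tee, so the write itself
# is performed by the tee process (which is what sudo would elevate).
# sudo is omitted here so the sketch runs unprivileged; hosts.local is a
# hypothetical target file.
echo "127.0.0.1 namenode" | tee -a hosts.local >/dev/null

# Option 2: keep the original line and suppress the finding instead by
# placing a shellcheck directive on the line directly above it:
# shellcheck disable=SC2024
# sudo echo "127.0.0.1 namenode" >>/etc/hosts

cat hosts.local
rm -f hosts.local
```

Option 3 (adding `-e SC2024` to `SHELLCHECK_OPTS`) silences the code repository-wide, so it is usually reserved for style codes rather than warnings.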



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh.orig
+++ docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh
--- docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh.orig
+++ docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -187,7 +187,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -387,7 +387,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -426,7 +426,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -452,12 +452,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -619,9 +619,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -672,102 +672,102 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
 if [[ "${RUN_RANGER}" -eq 1 ]]; then
-    start_ranger > start_ranger.log 2>&1 &
+    start_ranger >start_ranger.log 2>&1 &
     pids["ranger"]=$!
 fi
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt  -w filename
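The diffs above all apply the same shfmt conventions: 4-space indentation, no space between a redirection operator and its target (`>file`, `&>>file`), and `; then` on the same line as the test. A minimal sketch of a health-check helper written in that style (the function and log paths here are hypothetical, loosely modeled on `supervisorctl-check.sh`):

```shell
#!/usr/bin/env bash
# Hypothetical health-check helper formatted per the shfmt rules the
# diffs above enforce: 4-space indent, no space after > / >>.
check_services() {
    local failed
    # "$1" stands in for `supervisorctl status` output in this sketch.
    failed=$(printf '%s\n' "$1" | grep -v RUNNING || true)
    if [[ "${failed}" == "" ]]; then
        echo "All services are running"
        return 0
    else
        echo "Some of the services are failing: ${failed}"
        return 1
    fi
}

# Redirection written shfmt-style: no space before the target file.
check_services "hive RUNNING" >/tmp/health.log 2>&1
```

Running `shfmt -d` on a file formatted this way produces no diff, which is what makes the bot's report go green.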


@mymeiyi mymeiyi force-pushed the add-mow-compact-key branch from 66023c9 to d803eda Compare March 7, 2025 07:12

github-actions bot commented Mar 7, 2025

sh-checker report

To get the full details, please check in the job output.

shellcheck errors

'shellcheck ' returned error 1 finding the following syntactical issues:

----------

In bin/flight_record_fe.sh line 47:
FE_PID=$(${JAVA_HOME}/bin/jps | grep DorisFE | awk '{print $1}')
         ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
FE_PID=$("${JAVA_HOME}"/bin/jps | grep DorisFE | awk '{print $1}')


In bin/profile_fe.sh line 47:
FE_PID=$(${JAVA_HOME}/bin/jps | grep DorisFE | awk '{print $1}')
         ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
FE_PID=$("${JAVA_HOME}"/bin/jps | grep DorisFE | awk '{print $1}')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 22:
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
                                                               ^-----------^ SC2006 (style): Use $(...) notation instead of legacy backticks `...`.

Did you mean: 
export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}


In docker/thirdparties/docker-compose/common/hive-configure.sh line 29:
  local entry="<property><name>$name</name><value>${value}</value></property>"
                               ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local entry="<property><name>${name}</name><value>${value}</value></property>"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 30:
  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
        ^----------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                            ^----^ SC2086 (info): Double quote to prevent globbing and word splitting.
                            ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  local escapedEntry=$(echo "${entry}" | sed 's/\//\\\//g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 31:
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
                                                        ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" "${path}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 42:
    echo "Configuring $module"
                      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "Configuring ${module}"


In docker/thirdparties/docker-compose/common/hive-configure.sh line 43:
    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                                                                                            ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix="${envPrefix}"); do 


In docker/thirdparties/docker-compose/common/hive-configure.sh line 44:
        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                   ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        name=$(echo "${c}" | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')


In docker/thirdparties/docker-compose/common/hive-configure.sh line 47:
        echo " - Setting $name=$  "
                         ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo " - Setting ${name}=$  "


In docker/thirdparties/docker-compose/common/hive-configure.sh line 48:
        addProperty $path $name "$value"
                    ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                    ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^---^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        addProperty "${path}" "${name}" "${value}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 26:
    // clear output file
    ^-- SC1127 (error): Was this intended as a comment? Use # in sh.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 27:
    echo "" > "$output_file"
               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    echo "" > "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 31:
        if [ -n "$type_value" ]; then
           ^------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -n "${type_value}" ]]; then


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 32:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                         ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                               ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_type\": \"${type_value}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 34:
            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
                                             ^---------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "{\"index\": {\"_index\": \"${index_name}\", \"_id\": \"${id_prefix}${id}\"}}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 36:
        echo "$line"  >> "$output_file"
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "${line}"  >> "${output_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 38:
    done < "$data_file"
            ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    done < "${data_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 79:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 80:
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_5_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 119:
generate_bulk_request "composite_type_array" "doc" "item_" "$array_data_file" "$bulk_request_file"
                                                            ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                               ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 120:
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_6_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 126:
curl "http://${ES_7_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_7_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 164:
generate_bulk_request "composite_type_array" "_doc" "item_" "$array_data_file" "$bulk_request_file"
                                                             ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "_doc" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 165:
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_7_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 171:
curl "http://${ES_8_HOST}:9200/test1" -H "Content-Type:application/json" -X PUT -d "@/mnt/scripts/index/es7_test1.json"
             ^----------^ SC2154 (warning): ES_8_HOST is referenced but not assigned.


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 207:
generate_bulk_request "composite_type_array" "" "item_" "$array_data_file" "$bulk_request_file"
                                                         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                            ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
generate_bulk_request "composite_type_array" "" "item_" "${array_data_file}" "${bulk_request_file}"


In docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh line 208:
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@$bulk_request_file" -H "Content-Type: application/json"
                                                              ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
curl -X POST "http://${ES_8_HOST}:9200/_bulk" --data-binary "@${bulk_request_file}" -H "Content-Type: application/json"


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh line 11:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f test_data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 9:
    cd ${CUR_DIR}/ && rm -f data.tar.gz \
       ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    cd "${CUR_DIR}"/ && rm -f data.tar.gz \


In docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh line 12:
    cd -
    ^--^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.

Did you mean: 
    cd - || exit


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 33:
while ! $(nc -z localhost "${HMS_PORT:-9083}"); do
        ^-- SC2091 (warning): Remove surrounding $() to avoid executing output (or use eval if intentional).


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 42:
find "${DATA_DIR}" -type f -name "run.sh" -print0 | xargs -0 -n 1 -P "${parallel}" -I {} bash -ec '
                                                                                                  ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh line 148:
ls /mnt/scripts/create_preinstalled_scripts/*.hql | xargs -n 1 -P "${parallel}" -I {} bash -ec '
^-- SC2011 (warning): Use 'find .. -print0 | xargs -0 ..' or 'find .. -exec .. +' to allow non-alphanumeric filenames.
                                                                                               ^-- SC2016 (info): Expressions don't expand in single quotes, use double quotes for that.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 25:
if [ -z "${HADOOP_HOME}" ]; then
   ^---------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HADOOP_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 29:
if [ -z "${HIVE_HOME}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_HOME}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 35:
HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
                   ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                    ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                          ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                          ^--^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HUDI_HIVE_UBER_JAR=$(ls -c "${DIR}"/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 37:
if [ -z "$HADOOP_CONF_DIR" ]; then
   ^-----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^--------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ -z "${HADOOP_CONF_DIR}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 43:
HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_EXEC=$(ls "${HIVE_HOME}"/lib/hive-exec-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 44:
HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_SERVICE=$(ls "${HIVE_HOME}"/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 45:
HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
               ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
                ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_METASTORE=$(ls "${HIVE_HOME}"/lib/hive-metastore-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 46:
HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
          ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
           ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
              ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 47:
if [ -z "${HIVE_JDBC}" ]; then
   ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ -z "${HIVE_JDBC}" ]]; then


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 48:
  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
            ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
             ^-- SC2010 (warning): Don't use ls | grep. Use a glob or a for loop with a condition to allow non-alphanumeric filenames.
                ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
  HIVE_JDBC=$(ls "${HIVE_HOME}"/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 50:
HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
             ^-- SC2006 (style): Use $(...) notation instead of legacy backticks `...`.
              ^-- SC2012 (info): Use find instead of ls to better handle non-alphanumeric filenames.
                 ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
HIVE_JACKSON=$(ls "${HIVE_HOME}"/lib/jackson-*.jar | tr '\n' ':')


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 51:
HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
          ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                          ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                        ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                   ^--------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                              ^-----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
HIVE_JARS=${HIVE_METASTORE}:${HIVE_SERVICE}:${HIVE_EXEC}:${HIVE_JDBC}:${HIVE_JACKSON}


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 53:
HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
                 ^-- SC2125 (warning): Brace expansions and globs are literal in assignments. Quote it or use an array.


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 55:
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:$HUDI_HIVE_UBER_JAR org.apache.hudi.hive.HiveSyncTool $@"
                                                                        ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                                                                                              ^-- SC2145 (error): Argument mixes string and array. Use * or separate argument.

Did you mean: 
echo "Running Command : java -cp ${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR}:${HUDI_HIVE_UBER_JAR} org.apache.hudi.hive.HiveSyncTool $@"


In docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh line 56:
java -cp $HUDI_HIVE_UBER_JAR:${HADOOP_HIVE_JARS}:${HADOOP_CONF_DIR} org.apache.hudi.hive.HiveSyncTool "$@"
         ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
         ^-----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                             ^-----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                 ^----------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
java -cp "${HUDI_HIVE_UBER_JAR}":"${HADOOP_HIVE_JARS}":"${HADOOP_CONF_DIR}" org.apache.hudi.hive.HiveSyncTool "$@"


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_1.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 20:
cp /var/scripts/config/spark-defaults.conf $SPARK_CONF_DIR/.
                                           ^-------------^ SC2154 (warning): SPARK_CONF_DIR is referenced but not assigned.
                                           ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                           ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/spark-defaults.conf "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh line 21:
cp /var/scripts/config/log4j2.properties $SPARK_CONF_DIR/.
                                         ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                         ^-------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cp /var/scripts/config/log4j2.properties "${SPARK_CONF_DIR}"/.


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 19:
source /usr/local/common/hive-configure.sh
       ^-- SC1091 (info): Not following: /usr/local/common/hive-configure.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 20:
source /usr/local/common/event-hook.sh
       ^-----------------------------^ SC1091 (info): Not following: /usr/local/common/event-hook.sh: openBinaryFile: does not exist (No such file or directory)


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 34:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 36:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 57:
if [ $i -eq 60 ]; then
   ^-----------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
     ^-- SC2086 (info): Double quote to prevent globbing and word splitting.
     ^-- SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${i}" -eq 60 ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 64:
if [ "$1" == "1" ]; then
   ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
if [[ "$1" == "1" ]]; then


In docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh line 66:
elif [ "$1" == "2" ]; then
     ^-------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
elif [[ "$1" == "2" ]]; then


In docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh line 34:
if [ "$FAILED" == "" ]; then
   ^-----------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
      ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
if [[ "${FAILED}" == "" ]]; then


In docker/thirdparties/docker-compose/ranger/ranger-admin/ranger-entrypoint.sh line 24:
cd $RANGER_HOME
   ^----------^ SC2154 (warning): RANGER_HOME is referenced but not assigned.
   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.
   ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
cd "${RANGER_HOME}"


In docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh line 16:
#!/bin/bash
^-- SC1128 (error): The shebang must be on the first line. Delete blanks and move comments.


In docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh line 19:
if [ ! -d "${RANGER_HOME}/ews/webapp/WEB-INF/classes/ranger-plugins/doris" ]; then
   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
           ^------------^ SC2154 (warning): RANGER_HOME is referenced but not assigned.

Did you mean: 
if [[ ! -d "${RANGER_HOME}/ews/webapp/WEB-INF/classes/ranger-plugins/doris" ]]; then


In docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh line 15:
#!/bin/bash
^-- SC1128 (error): The shebang must be on the first line. Delete blanks and move comments.


In docker/thirdparties/run-thirdparties-docker.sh line 117:
    echo ${COMPONENTS}
         ^-----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
    echo "${COMPONENTS}"


In docker/thirdparties/run-thirdparties-docker.sh line 150:
RUN_OCENABASE=0
^-----------^ SC2034 (warning): RUN_OCENABASE appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 330:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 339:
        local backup_dir=/home/work/pipline/backup_center
              ^--------^ SC2034 (warning): backup_dir appears unused. Verify use (or export if used externally).


In docker/thirdparties/run-thirdparties-docker.sh line 344:
            echo "docker exec "${container_id}" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"
                               ^-------------^ SC2027 (warning): The surrounding quotes actually unquote this. Remove or escape them.
                               ^-------------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
            echo "docker exec ""${container_id}"" bash -c echo '/opt/bitnami/kafka/bin/kafka-topics.sh --create --bootstrap-server '${ip_host}:19193' --topic '${topic}'"


In docker/thirdparties/run-thirdparties-docker.sh line 363:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 365:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 367:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 375:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 376:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 380:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 391:
    sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 393:
        sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 403:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 404:
    if [ "_${IP_HOST}" == "_" ]; then
       ^----------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
    if [[ "_${IP_HOST}" == "_" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 406:
        exit -1
             ^-- SC2242 (error): Can only exit with status 0-255. Other data should be written to stdout/stderr.


In docker/thirdparties/run-thirdparties-docker.sh line 414:
        if [ -f "$file" ]; then
           ^------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -f "${file}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 415:
            echo "Processing $file"
                             ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Processing ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 419:
            echo "File not found: $file"
                                  ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "File not found: ${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 430:
    sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
                           ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
    sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down


In docker/thirdparties/run-thirdparties-docker.sh line 432:
        sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
                               ^--------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        sudo docker compose -p "${CONTAINER_UID}"hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait


In docker/thirdparties/run-thirdparties-docker.sh line 499:
        cp "${trino_docker}/$file.tpl" "${trino_docker}/$file"
                            ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                        ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        cp "${trino_docker}/${file}.tpl" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 500:
        if [[ $file != "hive.properties" ]]; then
              ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ ${file} != "hive.properties" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 501:
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/$file"
                                                                   ^---^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            sed -i "s/doris--/${CONTAINER_UID}/g" "${trino_docker}/${file}"


In docker/thirdparties/run-thirdparties-docker.sh line 510:
        sudo echo "127.0.0.1 ${NAMENODE_CONTAINER_ID}" >>/etc/hosts
                                                       ^-- SC2024 (warning): sudo doesn't affect redirects. Use .. | sudo tee -a file


In docker/thirdparties/run-thirdparties-docker.sh line 512:
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ${HIVE_METASTORE_CONTAINER_ID})
                                                                                                               ^----------------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        hive_metastore_ip=$(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' "${HIVE_METASTORE_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 514:
        if [ -z "$hive_metastore_ip" ]; then
           ^-------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        if [[ -z "${hive_metastore_ip}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 518:
            echo "Hive Metastore IP address is: $hive_metastore_ip"
                                                ^----------------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            echo "Hive Metastore IP address is: ${hive_metastore_ip}"


In docker/thirdparties/run-thirdparties-docker.sh line 533:
            while [ $retries -lt $max_retries ]; do
                  ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                    ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                    ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                 ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                                 ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            while [[ "${retries}" -lt "${max_retries}" ]]; do


In docker/thirdparties/run-thirdparties-docker.sh line 534:
                status=$(docker inspect --format '{{.State.Running}}' ${TRINO_CONTAINER_ID})
                                                                      ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
                status=$(docker inspect --format '{{.State.Running}}' "${TRINO_CONTAINER_ID}")


In docker/thirdparties/run-thirdparties-docker.sh line 535:
                if [ "${status}" == "${expected_status}" ]; then
                   ^-- SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.

Did you mean: 
                if [[ "${status}" == "${expected_status}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 546:
            if [ $retries -eq $max_retries ]; then
               ^---------------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                 ^------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                              ^----------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
                              ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
            if [[ "${retries}" -eq "${max_retries}" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 547:
                echo "${operation} operation failed to complete after $max_retries attempts."
                                                                      ^----------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
                echo "${operation} operation failed to complete after ${max_retries} attempts."


In docker/thirdparties/run-thirdparties-docker.sh line 552:
        docker stop ${TRINO_CONTAINER_ID}
                    ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker stop "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 557:
        docker start ${TRINO_CONTAINER_ID}
                     ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker start "${TRINO_CONTAINER_ID}"


In docker/thirdparties/run-thirdparties-docker.sh line 563:
        docker exec -it ${TRINO_CONTAINER_ID} /bin/bash -c 'trino -f /scripts/create_trino_table.sql'
                        ^-------------------^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.

Did you mean: 
        docker exec -it "${TRINO_CONTAINER_ID}" /bin/bash -c 'trino -f /scripts/create_trino_table.sql'


In docker/thirdparties/run-thirdparties-docker.sh line 601:
        mv *.tbl ../lakesoul/test_files/tpch/data
           ^-- SC2035 (info): Use ./*glob* or -- *glob* so names with dashes won't become options.


In docker/thirdparties/run-thirdparties-docker.sh line 603:
        export TPCH_DATA=$(realpath lakesoul/test_files/tpch/data)
               ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.


In docker/thirdparties/run-thirdparties-docker.sh line 615:
    IP_HOST=$(ifconfig "${eth_name}" | grep inet | grep -v 127.0.0.1 | grep -v inet6 | awk '{print $2}' | tr -d "addr:" | head -n 1)
                                                                                                                ^-----^ SC2020 (info): tr replaces sets of chars, not words (mentioned due to duplicates).


In docker/thirdparties/run-thirdparties-docker.sh line 620:
        . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
          ^-- SC1090 (warning): ShellCheck can't follow non-constant source. Use a directive to specify location.
                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        . "${ROOT}"/docker-compose/kerberos/kerberos"${i}"_settings.env


In docker/thirdparties/run-thirdparties-docker.sh line 621:
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
                                                                                                                       ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-"${i}".env


In docker/thirdparties/run-thirdparties-docker.sh line 622:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/my.cnf


In docker/thirdparties/run-thirdparties-docker.sh line 623:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                    ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/kdc.conf


In docker/thirdparties/run-thirdparties-docker.sh line 624:
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
                                                                 ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.
                                                                                                                                     ^--^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos"${i}"/krb5.conf


In docker/thirdparties/run-thirdparties-docker.sh line 779:
    wait "${pids[$compose]}" || status=$?
                 ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    wait "${pids[${compose}]}" || status=$?


In docker/thirdparties/run-thirdparties-docker.sh line 780:
    if [ $status -ne 0 ] && [ $compose != "db2" ]; then
       ^---------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
         ^-----^ SC2248 (style): Prefer double quoting even when variables don't contain special characters.
         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                            ^-------------------^ SC2292 (style): Prefer [[ ]] over [ ] for tests in Bash/Ksh.
                              ^------^ SC2086 (info): Double quote to prevent globbing and word splitting.
                              ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
    if [[ "${status}" -ne 0 ]] && [[ "${compose}" != "db2" ]]; then


In docker/thirdparties/run-thirdparties-docker.sh line 781:
        echo "docker $compose started failed with status $status"
                     ^------^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.
                                                         ^-----^ SC2250 (style): Prefer putting braces around variable references even when not strictly required.

Did you mean: 
        echo "docker ${compose} started failed with status ${status}"


In docker/thirdparties/run-thirdparties-docker.sh line 783:
        cat start_${compose}.log || true
                  ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean: 
        cat start_"${compose}".log || true

For more information:
  https://www.shellcheck.net/wiki/SC1127 -- Was this intended as a comment? U...
  https://www.shellcheck.net/wiki/SC1128 -- The shebang must be on the first ...
  https://www.shellcheck.net/wiki/SC2145 -- Argument mixes string and array. ...
----------

You can address the above issues in one of three ways:
1. Manually correct the issue in the offending shell script;
2. Disable specific issues by adding the comment:
  # shellcheck disable=NNNN
above the line that contains the issue, where NNNN is the error code;
3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
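As a sketch of option 2, a `# shellcheck disable=NNNN` directive suppresses a single finding on the line immediately below it. The snippet is a minimal, hypothetical illustration built around the SC2086 (unquoted variable expansion) findings reported above; the paths and variable names are made up for the example and are not taken from the scripts in this PR.

```shell
#!/bin/bash
# Minimal sketch of option 2: an inline directive silences one finding
# on the next line only, leaving the rest of the script checked as usual.
# All paths here are hypothetical.
SPARK_CONF_DIR="/tmp/spark-conf-demo"
mkdir -p "${SPARK_CONF_DIR}"
touch /tmp/demo.conf

# Without the directive, shellcheck would flag the unquoted expansion (SC2086).
# shellcheck disable=SC2086
cp /tmp/demo.conf $SPARK_CONF_DIR/.

ls "${SPARK_CONF_DIR}"
```

Option 3 behaves the same way globally: adding an exclusion such as `-e SC2086` to `SHELLCHECK_OPTS` in the workflow .yml suppresses that code across every script, so the inline directive is usually the better choice for a one-off exception.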



shfmt errors

'shfmt ' returned error 1 finding the following formatting issues:

----------
--- docker/thirdparties/docker-compose/common/event-hook.sh.orig
+++ docker/thirdparties/docker-compose/common/event-hook.sh
--- docker/thirdparties/docker-compose/common/hive-configure.sh.orig
+++ docker/thirdparties/docker-compose/common/hive-configure.sh
@@ -19,16 +19,16 @@
 # Referenced from [docker-hive](https://github.com/big-data-europe/docker-hive)
 
 # Set some sensible defaults
-export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://`hostname -f`:8020}
+export CORE_CONF_fs_defaultFS=${CORE_CONF_fs_defaultFS:-hdfs://$(hostname -f):8020}
 
 function addProperty() {
-  local path=$1
-  local name=$2
-  local value=$3
+    local path=$1
+    local name=$2
+    local value=$3
 
-  local entry="<property><name>$name</name><value>${value}</value></property>"
-  local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
-  sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
+    local entry="<property><name>$name</name><value>${value}</value></property>"
+    local escapedEntry=$(echo $entry | sed 's/\//\\\//g')
+    sed -i "/<\/configuration>/ s/.*/${escapedEntry}\n&/" $path
 }
 
 function configure() {
@@ -38,10 +38,10 @@
 
     local var
     local value
-    
+
     echo "Configuring $module"
-    for c in `printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix`; do 
-        name=`echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g'`
+    for c in $(printenv | perl -sne 'print "$1 " if m/^${envPrefix}_(.+?)=.*/' -- -envPrefix=$envPrefix); do
+        name=$(echo ${c} | perl -pe 's/___/-/g; s/__/_/g; s/_/./g')
         var="${envPrefix}_${c}"
         value=${!var}
         echo " - Setting $name=$  "
--- docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh.orig
+++ docker/thirdparties/docker-compose/elasticsearch/scripts/es_init.sh
@@ -24,18 +24,18 @@
     local output_file=$5
 
     // clear output file
-    echo "" > "$output_file"
+    echo "" >"$output_file"
 
     local id=1
     while IFS= read -r line; do
         if [ -n "$type_value" ]; then
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_type\": \"$type_value\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         else
-            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}"  >> "$output_file"
+            echo "{\"index\": {\"_index\": \"$index_name\", \"_id\": \"${id_prefix}${id}\"}}" >>"$output_file"
         fi
-        echo "$line"  >> "$output_file"
+        echo "$line" >>"$output_file"
         id=$((id + 1))
-    done < "$data_file"
+    done <"$data_file"
 }
 
 array_data_file="/mnt/scripts/data/composite_type_array_bulk.json"
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_csv/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/datev2_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_config_test/run.sh
@@ -11,4 +11,3 @@
 hive -f "${CUR_DIR}"/create_table.hql
 
 hadoop fs -rm -r /user/doris/suites/default/hive_ignore_absent_partitions_table/country=India
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_all_types/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_array_delimiter/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_textfile_nestedarray/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_orc/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/hive_upper_case_parquet/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/logs1_parquet/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/logs1_parquet/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/one_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/orc_predicate/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/orc_predicate_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lz4_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/parquet_lzo_compression/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/partition_manual_remove/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_complex_types/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_complex_types/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_compress_partitioned/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_compress_partitioned/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_csv_format_error/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_date_string_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/test_wide_table/run.sh
@@ -5,9 +5,9 @@
 
 if [[ ! -d "${CUR_DIR}/data" ]]; then
     echo "${CUR_DIR}/data does not exist"
-    cd "${CUR_DIR}" && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd "${CUR_DIR}" && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/multi_catalog/test_wide_table/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/data exist, continue !"
@@ -19,4 +19,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/multi_catalog/two_partition/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/crdmm_data/run.sh
@@ -10,5 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/create_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/regression/serde_prop/run.sh
@@ -5,5 +5,3 @@
 
 # create table
 hive -f "${CUR_DIR}"/some_serde_table.hql
-
-
--- docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/statistics/statistics/run.sh
@@ -10,4 +10,3 @@
 
 # create table
 hive -f "${CUR_DIR}/create_table.hql"
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tpch_1000_parquet/part/run.sh
@@ -19,4 +19,3 @@
 
 # # create table
 # hive -f "${CUR_DIR}"/create_table.hql
-
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_hdfs_tvf_compression/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/test_data" ]]; then
     echo "${CUR_DIR}/test_data does not exist"
-    cd ${CUR_DIR}/ && rm -f test_data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz \
-    && tar xzf test_data.tar.gz
+    cd ${CUR_DIR}/ && rm -f test_data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_hdfs_tvf_compression/test_data.tar.gz &&
+        tar xzf test_data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/test_data exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/data/tvf/test_tvf/run.sh
@@ -6,9 +6,9 @@
 
 if [[ ! -d "${CUR_DIR}/tvf" ]]; then
     echo "${CUR_DIR}/tvf does not exist"
-    cd ${CUR_DIR}/ && rm -f data.tar.gz \
-    && curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz \
-    && tar xzf data.tar.gz
+    cd ${CUR_DIR}/ && rm -f data.tar.gz &&
+        curl -O https://s3BucketName.s3Endpoint/regression/datalake/pipeline_data/test_tvf/data.tar.gz &&
+        tar xzf data.tar.gz
     cd -
 else
     echo "${CUR_DIR}/tvf exist, continue !"
--- docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh.orig
+++ docker/thirdparties/docker-compose/hive/scripts/hive-metastore.sh
@@ -97,7 +97,6 @@
 hadoop_put_pids=()
 hadoop fs -mkdir -p /user/doris/
 
-
 ## put tpch1
 if [[ -z "$(ls /mnt/scripts/tpch1.db)" ]]; then
     echo "tpch1.db does not exist"
@@ -126,7 +125,6 @@
 hadoop fs -copyFromLocal -f /mnt/scripts/preinstalled_data /user/doris/ &
 hadoop_put_pids+=($!)
 
-
 # wait put finish
 set +e
 wait "${hadoop_put_pids[@]}"
--- docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/run_sync_tool.sh
@@ -18,36 +18,36 @@
 # under the License.
 
 function error_exit {
-    echo "$1" >&2   ## Send message to stderr. Exclude >&2 if you don't want it that way.
-    exit "${2:-1}"  ## Return a code specified by $2 or 1 by default.
+    echo "$1" >&2  ## Send message to stderr. Exclude >&2 if you don't want it that way.
+    exit "${2:-1}" ## Return a code specified by $2 or 1 by default.
 }
 
 if [ -z "${HADOOP_HOME}" ]; then
-  error_exit "Please make sure the environment variable HADOOP_HOME is setup"
+    error_exit "Please make sure the environment variable HADOOP_HOME is setup"
 fi
 
 if [ -z "${HIVE_HOME}" ]; then
-  error_exit "Please make sure the environment variable HIVE_HOME is setup"
+    error_exit "Please make sure the environment variable HIVE_HOME is setup"
 fi
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 #Ensure we pick the right jar even for hive11 builds
-HUDI_HIVE_UBER_JAR=`ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1`
+HUDI_HIVE_UBER_JAR=$(ls -c $DIR/./hudi_docker_compose_attached_file/jar/hoodie-hive-sync-bundle.jar | grep -v source | head -1)
 
 if [ -z "$HADOOP_CONF_DIR" ]; then
-  echo "setting hadoop conf dir"
-  HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
+    echo "setting hadoop conf dir"
+    HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"
 fi
 
 ## Include only specific packages from HIVE_HOME/lib to avoid version mismatches
-HIVE_EXEC=`ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':'`
-HIVE_SERVICE=`ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':'`
-HIVE_METASTORE=`ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':'`
-HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':'`
+HIVE_EXEC=$(ls ${HIVE_HOME}/lib/hive-exec-*.jar | tr '\n' ':')
+HIVE_SERVICE=$(ls ${HIVE_HOME}/lib/hive-service-*.jar | grep -v rpc | tr '\n' ':')
+HIVE_METASTORE=$(ls ${HIVE_HOME}/lib/hive-metastore-*.jar | tr '\n' ':')
+HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | tr '\n' ':')
 if [ -z "${HIVE_JDBC}" ]; then
-  HIVE_JDBC=`ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':'`
+    HIVE_JDBC=$(ls ${HIVE_HOME}/lib/hive-jdbc-*.jar | grep -v handler | tr '\n' ':')
 fi
-HIVE_JACKSON=`ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':'`
+HIVE_JACKSON=$(ls ${HIVE_HOME}/lib/jackson-*.jar | tr '\n' ':')
 HIVE_JARS=$HIVE_METASTORE:$HIVE_SERVICE:$HIVE_EXEC:$HIVE_JDBC:$HIVE_JACKSON
 
 HADOOP_HIVE_JARS=${HIVE_JARS}:${HADOOP_HOME}/share/hadoop/common/*:${HADOOP_HOME}/share/hadoop/mapreduce/*:${HADOOP_HOME}/share/hadoop/hdfs/*:${HADOOP_HOME}/share/hadoop/common/lib/*:${HADOOP_HOME}/share/hadoop/hdfs/lib/*
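The hunk above replaces legacy backtick command substitution with `$(...)` (shellcheck SC2006). The two forms produce the same output, but `$(...)` nests without backslash-escaping, which is the readability win shfmt is after. A minimal sketch, using hypothetical strings rather than the scripts above:

```shell
# Backticks nest only with escaping; $(...) nests directly.
legacy=`echo \`echo nested\``  # inner backticks must be escaped
modern=$(echo $(echo nested))  # same result, no escaping needed
echo "$legacy $modern"
```

Both variables end up holding `nested`; the only difference is how readable the nesting is.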
--- docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh.orig
+++ docker/thirdparties/docker-compose/hudi/scripts/setup_demo_container_adhoc_2.sh
@@ -36,42 +36,42 @@
 
 echo "Start synchronizing the stock_ticks_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_cow \
-  --database default \
-  --table stock_ticks_cow \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_cow \
+    --database default \
+    --table stock_ticks_cow \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the stock_ticks_mor table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by date \
-  --base-path /user/hive/warehouse/stock_ticks_mor \
-  --database default \
-  --table stock_ticks_mor \
-  --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by date \
+    --base-path /user/hive/warehouse/stock_ticks_mor \
+    --database default \
+    --table stock_ticks_mor \
+    --partition-value-extractor org.apache.hudi.hive.SlashEncodedDayPartitionValueExtractor
 
 echo "Start synchronizing the hudi_cow_pt_tbl table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --partitioned-by dt \
-  --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
-  --database default \
-  --table hudi_cow_pt_tbl \
-  --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --partitioned-by dt \
+    --base-path /user/hive/warehouse/hudi_cow_pt_tbl \
+    --database default \
+    --table hudi_cow_pt_tbl \
+    --partition-value-extractor org.apache.hudi.hive.HiveStylePartitionValueExtractor
 
 echo "Start synchronizing the hudi_non_part_cow table"
 /var/scripts/run_sync_tool.sh \
-  --jdbc-url jdbc:hive2://hiveserver:10000 \
-  --user hive \
-  --pass hive \
-  --base-path /user/hive/warehouse/hudi_non_part_cow \
-  --database default \
-  --table hudi_non_part_cow \
+    --jdbc-url jdbc:hive2://hiveserver:10000 \
+    --user hive \
+    --pass hive \
+    --base-path /user/hive/warehouse/hudi_non_part_cow \
+    --database default \
+    --table hudi_non_part_cow
--- docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh.orig
+++ docker/thirdparties/docker-compose/iceberg/tools/save_docker.sh
--- docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/entrypoint-hive-master.sh
@@ -69,6 +69,6 @@
     echo "Invalid index parameter. Exiting."
     exit 1
 fi
-hive  -f /usr/local/sql/create_kerberos_hive_table.sql
+hive -f /usr/local/sql/create_kerberos_hive_table.sql
 
 exec_success_hook
--- docker/thirdparties/docker-compose/kerberos/health-checks/health.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/health.sh
@@ -29,6 +29,6 @@
 
 if test -d "${HEALTH_D}"; then
     for health_script in "${HEALTH_D}"/*; do
-        "${health_script}" &>> /var/log/container-health.log || exit 1
+        "${health_script}" &>>/var/log/container-health.log || exit 1
     done
 fi
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check-2.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/hive-health-check.sh
--- docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh.orig
+++ docker/thirdparties/docker-compose/kerberos/health-checks/supervisorctl-check.sh
@@ -32,9 +32,9 @@
 FAILED=$(supervisorctl status | grep -v RUNNING || true)
 
 if [ "$FAILED" == "" ]; then
-  echo "All services are running"
-  exit 0
+    echo "All services are running"
+    exit 0
 else
-  echo "Some of the services are failing: ${FAILED}"
-  exit 1
+    echo "Some of the services are failing: ${FAILED}"
+    exit 1
 fi
--- docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh.orig
+++ docker/thirdparties/docker-compose/ranger/script/install_doris_ranger_plugins.sh
--- docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh.orig
+++ docker/thirdparties/docker-compose/ranger/script/install_doris_service_def.sh
--- docker/thirdparties/run-thirdparties-docker.sh.orig
+++ docker/thirdparties/run-thirdparties-docker.sh
@@ -187,7 +187,7 @@
         RUN_MARIADB=1
     elif [[ "${element}"x == "db2"x ]]; then
         RUN_DB2=1
-    elif [[ "${element}"x == "oceanbase"x ]];then
+    elif [[ "${element}"x == "oceanbase"x ]]; then
         RUN_OCEANBASE=1
     elif [[ "${element}"x == "lakesoul"x ]]; then
         RUN_LAKESOUL=1
@@ -387,7 +387,7 @@
     . "${ROOT}"/docker-compose/hive/hive-2x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-2x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-2x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-2x.env
     sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive2 -f "${ROOT}"/docker-compose/hive/hive-2x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-2x.env up --build --remove-orphans -d --wait
@@ -426,7 +426,7 @@
     . "${ROOT}"/docker-compose/hive/hive-3x_settings.env
     envsubst <"${ROOT}"/docker-compose/hive/hive-3x.yaml.tpl >"${ROOT}"/docker-compose/hive/hive-3x.yaml
     envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
-    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >> "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
+    envsubst <"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env.tpl >>"${ROOT}"/docker-compose/hive/hadoop-hive-3x.env
     sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env down
     if [[ "${STOP}" -ne 1 ]]; then
         sudo docker compose -p ${CONTAINER_UID}hive3 -f "${ROOT}"/docker-compose/hive/hive-3x.yaml --env-file "${ROOT}"/docker-compose/hive/hadoop-hive-3x.env up --build --remove-orphans -d --wait
@@ -452,12 +452,12 @@
     if [[ "${STOP}" -ne 1 ]]; then
         if [[ ! -d "${ICEBERG_DIR}/data" ]]; then
             echo "${ICEBERG_DIR}/data does not exist"
-            cd "${ICEBERG_DIR}" \
-            && rm -f iceberg_data.zip \
-            && wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip \
-            && sudo unzip iceberg_data.zip \
-            && sudo mv iceberg_data data \
-            && sudo rm -rf iceberg_data.zip
+            cd "${ICEBERG_DIR}" &&
+                rm -f iceberg_data.zip &&
+                wget -P "${ROOT}"/docker-compose/iceberg https://"${s3BucketName}.${s3Endpoint}"/regression/datalake/pipeline_data/iceberg_data.zip &&
+                sudo unzip iceberg_data.zip &&
+                sudo mv iceberg_data data &&
+                sudo rm -rf iceberg_data.zip
             cd -
         else
             echo "${ICEBERG_DIR}/data exist, continue !"
@@ -619,9 +619,9 @@
     for i in {1..2}; do
         . "${ROOT}"/docker-compose/kerberos/kerberos${i}_settings.env
         envsubst <"${ROOT}"/docker-compose/kerberos/hadoop-hive.env.tpl >"${ROOT}"/docker-compose/kerberos/hadoop-hive-${i}.env
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
-        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl > "${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/my.cnf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/my.cnf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/kdc.conf
+        envsubst <"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf.tpl >"${ROOT}"/docker-compose/kerberos/conf/kerberos${i}/krb5.conf
     done
     sudo chmod a+w /etc/hosts
     sudo sed -i "1i${IP_HOST} hadoop-master" /etc/hosts
@@ -672,102 +672,102 @@
 declare -A pids
 
 if [[ "${RUN_ES}" -eq 1 ]]; then
-    start_es > start_es.log  2>&1 &
+    start_es >start_es.log 2>&1 &
     pids["es"]=$!
 fi
 
 if [[ "${RUN_MYSQL}" -eq 1 ]]; then
-    start_mysql > start_mysql.log 2>&1 &
+    start_mysql >start_mysql.log 2>&1 &
     pids["mysql"]=$!
 fi
 
 if [[ "${RUN_PG}" -eq 1 ]]; then
-    start_pg > start_pg.log 2>&1 &
+    start_pg >start_pg.log 2>&1 &
     pids["pg"]=$!
 fi
 
 if [[ "${RUN_ORACLE}" -eq 1 ]]; then
-    start_oracle > start_oracle.log 2>&1 &
+    start_oracle >start_oracle.log 2>&1 &
     pids["oracle"]=$!
 fi
 
 if [[ "${RUN_DB2}" -eq 1 ]]; then
-    start_db2 > start_db2.log 2>&1 &
+    start_db2 >start_db2.log 2>&1 &
     pids["db2"]=$!
 fi
 
 if [[ "${RUN_OCEANBASE}" -eq 1 ]]; then
-    start_oceanbase > start_oceanbase.log 2>&1 &
+    start_oceanbase >start_oceanbase.log 2>&1 &
     pids["oceanbase"]=$!
 fi
 
 if [[ "${RUN_SQLSERVER}" -eq 1 ]]; then
-    start_sqlserver > start_sqlserver.log 2>&1 &
+    start_sqlserver >start_sqlserver.log 2>&1 &
     pids["sqlserver"]=$!
 fi
 
 if [[ "${RUN_CLICKHOUSE}" -eq 1 ]]; then
-    start_clickhouse > start_clickhouse.log 2>&1 &
+    start_clickhouse >start_clickhouse.log 2>&1 &
     pids["clickhouse"]=$!
 fi
 
 if [[ "${RUN_KAFKA}" -eq 1 ]]; then
-    start_kafka > start_kafka.log 2>&1 &
+    start_kafka >start_kafka.log 2>&1 &
     pids["kafka"]=$!
 fi
 
 if [[ "${RUN_HIVE2}" -eq 1 ]]; then
-    start_hive2 > start_hive2.log 2>&1 &
+    start_hive2 >start_hive2.log 2>&1 &
     pids["hive2"]=$!
 fi
 
 if [[ "${RUN_HIVE3}" -eq 1 ]]; then
-    start_hive3 > start_hive3.log 2>&1 &
+    start_hive3 >start_hive3.log 2>&1 &
     pids["hive3"]=$!
 fi
 
 if [[ "${RUN_SPARK}" -eq 1 ]]; then
-    start_spark > start_spark.log 2>&1 &
+    start_spark >start_spark.log 2>&1 &
     pids["spark"]=$!
 fi
 
 if [[ "${RUN_ICEBERG}" -eq 1 ]]; then
-    start_iceberg > start_icerberg.log 2>&1 &
+    start_iceberg >start_icerberg.log 2>&1 &
     pids["iceberg"]=$!
 fi
 
 if [[ "${RUN_HUDI}" -eq 1 ]]; then
-    start_hudi > start_hudi.log 2>&1 &
+    start_hudi >start_hudi.log 2>&1 &
     pids["hudi"]=$!
 fi
 
 if [[ "${RUN_TRINO}" -eq 1 ]]; then
-    start_trino > start_trino.log 2>&1 &
+    start_trino >start_trino.log 2>&1 &
     pids["trino"]=$!
 fi
 
 if [[ "${RUN_MARIADB}" -eq 1 ]]; then
-    start_mariadb > start_mariadb.log 2>&1 &
+    start_mariadb >start_mariadb.log 2>&1 &
     pids["mariadb"]=$!
 fi
 
 if [[ "${RUN_LAKESOUL}" -eq 1 ]]; then
-    start_lakesoul > start_lakesoule.log 2>&1 &
+    start_lakesoul >start_lakesoule.log 2>&1 &
     pids["lakesoul"]=$!
 fi
 
 if [[ "${RUN_MINIO}" -eq 1 ]]; then
-    start_minio > start_minio.log 2>&1 &
+    start_minio >start_minio.log 2>&1 &
     pids["minio"]=$!
 fi
 
 if [[ "${RUN_KERBEROS}" -eq 1 ]]; then
-    start_kerberos > start_kerberos.log 2>&1 &
+    start_kerberos >start_kerberos.log 2>&1 &
     pids["kerberos"]=$!
 fi
 
 if [[ "${RUN_RANGER}" -eq 1 ]]; then
-    start_ranger > start_ranger.log 2>&1 &
+    start_ranger >start_ranger.log 2>&1 &
     pids["ranger"]=$!
 fi
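Most hunks in this file only drop the space after redirection operators (`> file` becomes `>file`, `>> file` becomes `>>file`). The shell treats both spellings identically; shfmt merely normalizes the spacing. A quick check, using hypothetical `/tmp` paths:

```shell
# "cmd > file" and "cmd >file" are the same redirection to the shell;
# shfmt just normalizes to the no-space form.
echo hello > /tmp/redirect_a.txt
echo hello >/tmp/redirect_b.txt
cmp -s /tmp/redirect_a.txt /tmp/redirect_b.txt && echo identical
```

Since the files compare equal, none of these hunks change runtime behavior.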
 
----------

You can reformat the above files to meet shfmt's requirements by typing:

  shfmt -w filename

