[FLINK-18069][CI] Test if Java/Scaladocs builds are passing in the compile stage #12447

Closed
wants to merge 3 commits
tools/ci/compile.sh (65 changes: 42 additions & 23 deletions)

@@ -47,35 +47,54 @@ run_mvn clean install $MAVEN_OPTS -Dflink.convergence.phase=install -Pcheck-conv

 EXIT_CODE=$?

-if [ $EXIT_CODE == 0 ]; then
+if [ $EXIT_CODE != 0 ]; then
     echo "=============================================================================="
-    echo "Checking scala suffixes"
+    echo "Compiling Flink failed."
     echo "=============================================================================="
-
-    ${CI_DIR}/verify_scala_suffixes.sh "${PROFILE}"
-    EXIT_CODE=$?
-else
-    echo "=============================================================================="
-    echo "Previous build failure detected, skipping scala-suffixes check."
-    echo "=============================================================================="
+    exit $EXIT_CODE
 fi

+echo "============ Checking Javadocs ============"
+
+# use the same invocation as on buildbot (https://svn.apache.org/repos/infra/infrastructure/buildbot/aegis/buildmaster/master1/projects/flink.conf)
+run_mvn javadoc:aggregate -Paggregate-scaladoc -DadditionalJOption='-Xdoclint:none' \
Contributor:
Same deal as with Scala: move the output into a file.

Contributor Author:
I'm sorry that you have to point me to every instance of the same class of problem. I have no issue with verbose logs, which is why I didn't feel the urge to spend time controlling the logging behavior. I will address this and rebase onto the latest master (to see the build pass).

Contributor Author:
For the Scaladocs, I hide all warnings/errors and only show them if the build fails. I always show the Maven output.

For the Javadocs, I don't know whether the errors are sent to stderr. I feel it is not worth my time producing a Javadoc error just to learn how Javadoc output is printed (stderr vs. stdout); I would rather spend that time testing the new 1.11 features. In my opinion, it is okay to have 25,000 lines of logs for a regular Flink compile.

If the compile stage passes, you usually don't check its output. If it fails, you will see the error at the bottom of the log plus a lot of helpful debug information above it (which you would see anyway).

Contributor:
It shouldn't matter whether it goes to stdout or stderr; you could just pipe everything into a file and dump the whole thing if an error happens. Basically a simplified version of what you already did for Scala, which should be at most a one-minute fix.

Contributor Author:
I addressed the issue and rebased onto the latest master.
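The capture-and-dump pattern discussed above can be sketched as a small helper. This is a minimal illustration, not part of compile.sh: `run_quietly` and the sample commands are hypothetical. It redirects both stdout and stderr into a log file, stays silent when the command succeeds, and prints the full log only on failure.

```shell
#!/bin/sh
# Hypothetical helper illustrating the reviewer's suggestion: capture all
# output (stdout and stderr) in a file, stay quiet on success, and dump
# the whole log only when the command fails.
run_quietly() {
  log="$1"; shift
  "$@" > "$log" 2>&1
  code=$?
  if [ "$code" -ne 0 ]; then
    echo "ERROR (exit $code). Printing full output:"
    cat "$log"
  fi
  rm -f "$log"
  return "$code"
}

# A passing command produces no output; a failing one dumps its log.
run_quietly build.log sh -c 'echo some noisy output; exit 0'
run_quietly build.log sh -c 'echo the actual error; exit 3'
```

This sidesteps the stdout-vs-stderr question entirely, since both streams land in the same file.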

+  -Dmaven.javadoc.failOnError=false -Dcheckstyle.skip=true -Denforcer.skip=true \
+  -Dheader=someTestHeader > javadoc.out
+EXIT_CODE=$?
+if [ $EXIT_CODE != 0 ] ; then
+  echo "ERROR in Javadocs. Printing full output:"
+  cat javadoc.out ; rm javadoc.out
+  exit $EXIT_CODE
+fi

-if [ $EXIT_CODE == 0 ]; then
-    check_shaded_artifacts
-    EXIT_CODE=$(($EXIT_CODE+$?))
-    check_shaded_artifacts_s3_fs hadoop
-    EXIT_CODE=$(($EXIT_CODE+$?))
-    check_shaded_artifacts_s3_fs presto
-    EXIT_CODE=$(($EXIT_CODE+$?))
-    check_shaded_artifacts_connector_elasticsearch 5
-    EXIT_CODE=$(($EXIT_CODE+$?))
-    check_shaded_artifacts_connector_elasticsearch 6
-    EXIT_CODE=$(($EXIT_CODE+$?))
-else
-    echo "=============================================================================="
-    echo "Previous build failure detected, skipping shaded dependency check."
-    echo "=============================================================================="
-fi
+echo "============ Checking Scaladocs ============"
+
+cd flink-scala
+run_mvn scala:doc 2> scaladoc.out
+EXIT_CODE=$?
+if [ $EXIT_CODE != 0 ] ; then
+  echo "ERROR in Scaladocs. Printing full output:"
+  cat scaladoc.out ; rm scaladoc.out
+  exit $EXIT_CODE
+fi
+cd ..
+
+echo "============ Checking scala suffixes ============"
+
+${CI_DIR}/verify_scala_suffixes.sh "${PROFILE}" || exit $?
+
+echo "============ Checking shaded dependencies ============"
+
+check_shaded_artifacts
+EXIT_CODE=$(($EXIT_CODE+$?))
+check_shaded_artifacts_s3_fs hadoop
+EXIT_CODE=$(($EXIT_CODE+$?))
+check_shaded_artifacts_s3_fs presto
+EXIT_CODE=$(($EXIT_CODE+$?))
+check_shaded_artifacts_connector_elasticsearch 5
+EXIT_CODE=$(($EXIT_CODE+$?))
+check_shaded_artifacts_connector_elasticsearch 6
+EXIT_CODE=$(($EXIT_CODE+$?))

 exit $EXIT_CODE
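The tail of the script accumulates exit codes so that every shaded-artifact check runs even if an earlier one fails, and a single final `exit` reports overall failure. A minimal sketch of that pattern, where `check_a` and `check_b` are hypothetical stand-ins for the real checks:

```shell
#!/bin/sh
# Exit-code accumulation: run every check, sum the statuses, and report
# failure once at the end. check_a/check_b are hypothetical stand-ins
# for the check_shaded_artifacts* functions in compile.sh.
EXIT_CODE=0
check_a() { return 0; }   # hypothetical passing check
check_b() { return 2; }   # hypothetical failing check

check_a
EXIT_CODE=$(($EXIT_CODE+$?))
check_b
EXIT_CODE=$(($EXIT_CODE+$?))

echo "final EXIT_CODE: $EXIT_CODE"
```

The real script then ends with `exit $EXIT_CODE`. One caveat of summing: shell exit statuses are taken modulo 256, so in principle several failures could sum to a multiple of 256 and read as success; with the handful of small statuses involved here, that is a reasonable trade-off for simplicity.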