GH-40775: [Benchmarking][Java] Fix conbench timeout v2 #40929
Conversation
@ursabot please benchmark lang=Java
Benchmark runs are scheduled for commit b760135. Watch https://buildkite.com/apache-arrow and https://conbench.ursa.dev for updates. A comment will be posted here when the runs are complete.
Thanks for your patience. Conbench analyzed the 0 benchmarking runs that have been run so far on PR commit b760135. None of the specified runs were found on the Conbench server. The full Conbench report has more details.
@austin3dickey Could you please review this one? I am not sure why the report is shown as a failure, but the dashboard says it is passing.
@ursabot please benchmark lang=Java
Commit b760135 already has scheduled benchmark runs.
Yeah, looks like the build passed but no results were uploaded to Conbench. I will take a look.
@vibhatha I just ran another build that printed the stdout of the archery process. (Sorry it's not pretty -- all the newlines are represented in this build as …) It looks like the content of the output JSON file was … (We're running openjdk version "1.8.0_402".)
@austin3dickey Looking into the output:

```
{
  'type': 'BenchmarkGroupExecution',
  'id': 'b601127ce3d6465f8681656af958831a',
  'lang': 'Java',
  'name': 'java-micro',
  'options': '--iterations=1',
  'flags': {'language': 'Java'},
  'benchmarkable_id': 'b76013517bf4e33adb5fb141aa0a08a641c7a99c',
  'run_id': '766c585f2508499eb71d4fb293495096',
  'run_name': 'pull request: 40929',
  'machine': 'ursa-thinkcentre-m75q',
  'process_pid': 1796466,
  'command': 'conbench java-micro --iterations=1 --run-id=$RUN_ID --run-name="$RUN_NAME" --run-reason="$RUN_REASON" --commit=b76013517bf4e33adb5fb141aa0a08a641c7a99c --src=/var/lib/buildkite-agent/builds/ursa-thinkcentre-m75q-1/apache-arrow/arrow-bci-benchmark-on-ursa-thinkcentre-m75q/arrow',
  'started_at': '2024-04-02 09:13:48.394338',
  'finished_at': '2024-04-02 09:26:09.620209',
  'total_run_time': '0:12:21.225871',
  'failed': False,
  'return_code': 0,
  'stderr': '',
  'total_machine_virtual_memory': 16142196736
}
```

I haven't actually seen these logs before, but we probably need to compare against a passing benchmark run with the expected output.
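The failure mode in the summary above is that a run can report `'failed': False` and `'return_code': 0` while still uploading no benchmark results. A small sketch of how such a summary could be sanity-checked mechanically (the `looks_silently_failed` helper and the `benchmark_results` key are hypothetical, added for illustration; only the other field names come from the dict above):

```python
# Sketch: flag conbench run summaries that "succeeded" but produced no results.
# Field names other than `benchmark_results` mirror the BenchmarkGroupExecution
# dict shown above; `benchmark_results` is an assumed key for illustration.

def looks_silently_failed(summary: dict) -> bool:
    """Return True when the run exited cleanly but produced no results."""
    exited_cleanly = (not summary.get("failed", True)
                      and summary.get("return_code", 1) == 0)
    no_results = not summary.get("benchmark_results")
    return exited_cleanly and no_results

run = {
    "type": "BenchmarkGroupExecution",
    "failed": False,
    "return_code": 0,
    "stderr": "",
    # no "benchmark_results" key: nothing was uploaded to Conbench
}

print(looks_silently_failed(run))  # True: clean exit, but no results
```

A check like this would have turned the silent failure into a loud one instead of a green build with an empty Conbench report.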
@austin3dickey How do I query an older passing benchmark for Java? I haven't done this before; could you please give me a few pointers? The goal is to check whether something is wrong in the output. It says …
@vibhatha Unfortunately we never printed the stdout/stderr of the archery process in the past because of the GBs of log output, so we don't have past logs to look at. I think the critical line in that log is line 10377. I see a bunch of failures like the one below. I'm not sure why the process exits 0, since no benchmarks are run due to these failures.
That's fairly straightforward then. The logging library is too new for Java 8. (Do we need to enable logging at all?)
We could probably fix this issue or disable logging. Wouldn't logging hurt benchmark performance anyway?
According to our docs, if we do not specify a log level, Arrow Java defaults to DEBUG-level logging:
https://arrow.apache.org/docs/developers/java/development.html#id2
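If the default DEBUG level is the problem, one option is to put a `logback.xml` on the benchmark classpath that raises the root level. This is only a sketch: the appender layout below is a generic Logback configuration, not Arrow's shipped config.

```xml
<!-- logback.xml: raise the root log level so benchmark runs are not
     dominated by DEBUG output (generic example, not Arrow's actual config) -->
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="ERROR">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
```

Dropping from DEBUG to ERROR would also reduce the log volume that made archery's stdout run to gigabytes.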
We can use an older version then. Though, what I meant was this point: …
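If "an older version" means pinning the logging backend to a release that still runs on Java 8, the Maven dependency might look like the fragment below. The version shown is an assumption for illustration (newer Logback lines raise the minimum JDK); the actual pin should be checked against the follow-up PR.

```xml
<!-- Hypothetical pin of the logging backend to a Java-8-compatible line.
     Verify the exact version against what the fix actually used. -->
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
  <version>1.2.13</version>
</dependency>
```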
Closing this PR since the issue was resolved in another PR. |
Rationale for this change
This PR continues the work done in #40786.
What changes are included in this PR?
Improving build configs to resolve a memory issue in benchmarks.
Are these changes tested?
Tested via existing test cases; no explicit tests are added.
TBD whether further tests are required.
Are there any user-facing changes?
No
java.lang.OutOfMemoryError in Java benchmarks after local build cache change #40775