Support performance presets in the Elasticsearch output #37259
Conversation
💚 Build Succeeded
💔 Tests Failed
Still adding docs, but marking this ready for review since the code and tests are done.
❕ Build Aborted
```go
// config overrides.
// Returns a list of the user fields that were overwritten.
func applyPreset(preset string, userConfig *config.C) ([]string, error) {
	presetConfig := presetConfigs[preset]
```
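The override-tracking idea in the excerpt above can be sketched with plain maps. This is a simplified model only: plain maps stand in for the real `*config.C` type, and the function name and logic here are illustrative, not the actual Beats implementation.

```go
package main

import (
	"fmt"
	"sort"
)

// applyPresetSketch models the idea from the excerpt: merge a preset's
// settings over the user's config and report which user-set fields the
// preset overwrote. Plain maps stand in for the real *config.C type.
func applyPresetSketch(preset map[string]any, user map[string]any) []string {
	overridden := []string{}
	for key, value := range preset {
		if _, exists := user[key]; exists {
			overridden = append(overridden, key)
		}
		user[key] = value
	}
	sort.Strings(overridden) // deterministic order for logging
	return overridden
}

func main() {
	user := map[string]any{"bulk_max_size": 500, "hosts": []string{"localhost:9200"}}
	preset := map[string]any{"bulk_max_size": 1600, "worker": 1}
	fmt.Println(applyPresetSketch(preset, user)) // [bulk_max_size]
}
```

The returned list is what feeds the "setting is ignored because of performance preset" warnings discussed later in this thread.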
Might want to do `presetConfig, ok :=` to check if the preset exists?
That's what the `presetConfig == nil` check on the next line is for; the `ok` check wouldn't catch anything that the `nil` check doesn't already.
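The equivalence being discussed can be shown with a plain map lookup (the map here is illustrative, not the actual Beats code): when the map's value type is a pointer, a missing key yields `nil`, so the comma-ok form and a `nil` check reject exactly the same cases.

```go
package main

import "fmt"

// presetConfigs stands in for the real preset table; the values are
// pointers, so looking up a missing key yields nil.
var presetConfigs = map[string]*struct{ BulkMaxSize int }{
	"balanced": {BulkMaxSize: 1600},
}

func main() {
	// The comma-ok form reports false for an unknown preset...
	cfg, ok := presetConfigs["custom"]
	// ...but the zero value for a missing pointer entry is nil, so
	// checking cfg == nil catches exactly the same cases as !ok.
	fmt.Println(ok, cfg == nil) // false true
}
```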
Is it possible to log all the values used for a preset when a preset is applied? We could just log the contents of `presetConfigs`, for example.
Going to test this manually before approving; besides some logging suggestions, this looks good.
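The logging suggestion could look roughly like the sketch below. The function name and settings map are hypothetical; the real implementation would use the Beats logger rather than returning strings.

```go
package main

import "fmt"

// presetLogLines sketches the suggestion above: when a preset is applied,
// produce one log line per setting it carries so the effective values are
// visible in the logs. The settings map is illustrative, not the real
// presetConfigs type.
func presetLogLines(preset string, settings map[string]any) []string {
	lines := []string{fmt.Sprintf("Applying performance preset '%s'", preset)}
	for key, value := range settings {
		lines = append(lines, fmt.Sprintf("preset sets %s = %v", key, value))
	}
	return lines
}

func main() {
	for _, line := range presetLogLines("balanced", map[string]any{"bulk_max_size": 1600}) {
		fmt.Println(line)
	}
}
```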
❕ Build Aborted
Built Beats from this branch, copied them into an 8.12.0-SNAPSHOT agent package, then enrolled that agent with an 8.12.0-SNAPSHOT stateful deployment on cloud. I then added two custom output parameters
Do we need to make modifications in the control protocol output reload path as well? (`beats/x-pack/libbeat/management/managerV2.go`, lines 733 to 742 in `ea90a1d`)
💚 Build Succeeded
❕ Build Aborted
💚 Build Succeeded
Possibly, but so far I think the issue is elsewhere. I can reproduce your problem, but some expected log messages seem to be missing even when the component is definitely restarted. I'm troubleshooting now with some extra-verbose custom builds.
I've verified (via incredibly verbose logging) that the presets are seen and correctly updated when changed in a live policy. However, none of the log messages at "warn" level are visible, which might be what was throwing us off. Is there some reason those might be getting filtered?
Found the real problem: our config library is doing some sort of tricky internal namespace tracking. It doesn't affect which flags are set / applied, but it does affect our override checking: when we call […]. Not all of the config helpers work this way: rendering the full config to a string doesn't include this prefix, and when we call […]
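The symptom described above can be illustrated in isolation. This is a simplified model of the behavior, not the actual config library code, and the `"output."` prefix is purely illustrative: if one helper reports flattened keys with an internal namespace prefix while another renders them without it, a naive override check comparing the two key sets misses every match unless the prefix is normalized away.

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeKey strips a hypothetical internal namespace prefix so that
// keys coming from different config helpers can be compared. "output."
// is an illustrative prefix, not the library's actual namespace.
func normalizeKey(key string) string {
	return strings.TrimPrefix(key, "output.")
}

func main() {
	// Keys as reported by one helper (carrying the internal prefix)...
	flattened := []string{"output.bulk_max_size", "output.worker"}
	// ...versus the preset's own field names (no prefix).
	presetFields := map[string]bool{"bulk_max_size": true, "worker": true}

	for _, key := range flattened {
		// Without normalizeKey this lookup would always fail, and no
		// field would ever be detected as overridden.
		fmt.Println(key, "overridden:", presetFields[normalizeKey(key)])
	}
}
```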
❕ Build Aborted
❕ Build Aborted
"I'll just include the default/reference config changes in this PR too. This is surely not hubris and I will definitely not regret doing this the day before feature freeze," I thought yesterday. But anyway, now all the CI checks are failing even though my local […] passes.

I'm planning to leave the other docs updates for a followup PR, to avoid similar issues. However, the problem from yesterday seems fixed, verified in manual testing, and I added a unit test for the specific workaround I had to add; this should be ready for final review, subject to making CI happy.
❕ Build Aborted
❕ Build Aborted
Config template changes LGTM. Only a nit about a deleted module file.
Is this module removal intentional? I didn't see a mention of the removal in the commit or description, but I may have missed it.
Hmm, not intentional on my part, but it looks like it was one of the side effects of doing a `make update` on the full repo. Looking closer, I see that there's an identical file already in that directory without the `-xpack` suffix, and that every other module in that directory has only a single config file with no `-xpack` suffix, so I think this behavior is correct, and maybe it got missed because the CI's `make check` wasn't strict enough to enforce that regeneration. Good catch, thanks :-)
💚 Build Succeeded
Tested again with Fleet, works as expected. Thanks! 🚢
Adds the performance presets described in elastic/elastic-agent#3797 to the Elasticsearch output, configurable with the `preset` field.
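For reference, a minimal Elasticsearch output config exercising the new field might look like the sketch below (host and values are illustrative):

```yaml
output.elasticsearch:
  hosts: ["https://localhost:9200"]
  preset: balanced
  # This field is controlled by the preset; per the test steps below,
  # expect the "Setting 'bulk_max_size' is ignored because of
  # performance preset" warning in the logs.
  bulk_max_size: 500
```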
Proposed commit message

Adds the performance presets described in elastic/elastic-agent#3797 to the Elasticsearch output, configurable with the `preset` field.

Checklist

- I have made corresponding changes to the documentation (docs in a followup PR since they aren't subject to feature freeze)
- I have added an entry in `CHANGELOG.next.asciidoc` or `CHANGELOG-developer.next.asciidoc`.
How to test this PR locally

Live config flags aren't directly exposed, but there are a few ways to confirm what's happening on a live run:

- Add `preset: balanced` to the Elasticsearch output config, and confirm that the resulting log files have the message `Applying performance preset 'balanced'`.
- Set a field that the preset controls, e.g. `bulk_max_size: 500`, and confirm that the logs have the message `Setting 'bulk_max_size' is ignored because of performance preset`.
- Run with the `throughput` preset with a large input load, and confirm that the process uses more than 1 CPU.

Related issues