
Store mean for response-related metrics #683

Merged

Conversation

danielmitterdorfer
Member

With this commit we calculate and store the mean of response-related
metrics (i.e. throughput, service time and latency). This is useful
because sample means tend to be normally distributed even if the
population distribution (the underlying distribution from which the
individual samples are drawn) is not; this is known as the central
limit theorem. Normality matters because certain statistical tests,
such as the t-test, assume normally distributed values.
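The central limit theorem argument above can be illustrated with a small, self-contained sketch. This is illustrative only, not Rally code: the exponential distribution, the population mean of 100, and the sample sizes are all made-up assumptions standing in for skewed, latency-like measurements.

```python
import random
import statistics

random.seed(42)

# Assumed for illustration: latency-like samples drawn from a heavily
# skewed exponential distribution with population mean 100.0 (ms).
population_mean = 100.0

def sample_mean(n):
    # Mean of n individual samples from the skewed population.
    return statistics.fmean(
        random.expovariate(1 / population_mean) for _ in range(n)
    )

# Collect many per-iteration means. By the central limit theorem their
# distribution is approximately normal around the population mean, even
# though the individual samples are far from normally distributed.
means = [sample_mean(200) for _ in range(1000)]

print(abs(statistics.fmean(means) - population_mean) < 5)   # clusters near 100
print(statistics.stdev(means) < 20)                         # much tighter than the raw samples
```

Storing the per-iteration mean (rather than only raw samples) is what makes normality-assuming tests applicable downstream.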

@danielmitterdorfer danielmitterdorfer added enhancement Improves the status quo :Metrics How metrics are stored, calculated or aggregated labels Apr 18, 2019
@danielmitterdorfer danielmitterdorfer added this to the 1.1.0 milestone Apr 18, 2019
@ebadyano
Contributor

Do we also want to display it in the report or is it enough to just store it in the metric store?

@ebadyano ebadyano left a comment
LGTM! Thanks!

@danielmitterdorfer
Member Author

Do we also want to display it in the report or is it enough to just store it in the metric store?

For now it's intentionally only stored and not shown in the summary report. First, showing it would make the summary report even larger, and second, this is primarily meant for users who want to run statistical significance tests (and who I'd expect to use Elasticsearch as a metrics store already). If we decide to display it in the summary report, I suggest we do it in a follow-up PR.
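A minimal sketch of the downstream use case mentioned here: feeding stored per-iteration means into a two-sample t-test. This assumes SciPy is available; the baseline and contender numbers are invented placeholder values, not real benchmark results, and the variable names are hypothetical.

```python
from scipy import stats

# Hypothetical data: per-iteration mean service times (ms) as they might
# be retrieved from the Elasticsearch metrics store for two runs.
baseline_means = [101.2, 99.8, 100.5, 102.1, 98.9, 100.0, 101.5, 99.4]
contender_means = [96.1, 95.4, 97.0, 94.8, 96.5, 95.9, 94.2, 96.8]

# Two-sample t-test. Using means (rather than raw samples) is justified
# by the central limit theorem: the means are approximately normal, which
# the t-test assumes.
t_stat, p_value = stats.ttest_ind(baseline_means, contender_means)
print(p_value < 0.05)  # True for this data: the difference is significant
```

With such clearly separated groups the test rejects the null hypothesis; with noisier, overlapping runs it would not, which is exactly the decision the stored means are meant to support.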

@danielmitterdorfer danielmitterdorfer merged commit ae3e14a into elastic:master Apr 18, 2019
@danielmitterdorfer danielmitterdorfer deleted the add-mean-to-metrics branch April 18, 2019 16:30