Commit

Merge remote-tracking branch 'upstream/master' into alerting-ui-disable-edit

# Please enter a commit message to explain why this merge is necessary,
# especially if it merges an updated upstream into a topic branch.
#
# Lines starting with '#' will be ignored, and an empty message aborts
# the commit.
YulNaumenko committed May 11, 2020
2 parents 57fb50c + e67480d commit dd2f0e2
Showing 203 changed files with 3,560 additions and 1,325 deletions.
2 changes: 1 addition & 1 deletion .backportrc.json
@@ -25,7 +25,7 @@
],
"targetPRLabels": ["backport"],
"branchLabelMapping": {
"^v7.8.0$": "7.x",
"^v7.9.0$": "7.x",
"^v(\\d+).(\\d+).\\d+$": "$1.$2"
}
}
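
For illustration, here is a minimal TypeScript sketch of how a mapping like this resolves a version label to a target branch; the function name and first-match resolution order are assumptions, not backport's actual implementation:

```typescript
// Hypothetical sketch, not backport's implementation: resolve a PR label to
// a target branch using the branchLabelMapping patterns above.
const branchLabelMapping: Record<string, string> = {
  '^v7.9.0$': '7.x',
  '^v(\\d+).(\\d+).\\d+$': '$1.$2',
};

function resolveTargetBranch(label: string): string | undefined {
  // Try each pattern in order; the first match wins.
  for (const [pattern, replacement] of Object.entries(branchLabelMapping)) {
    const re = new RegExp(pattern);
    if (re.test(label)) {
      return label.replace(re, replacement);
    }
  }
  return undefined;
}

console.log(resolveTargetBranch('v7.9.0')); // "7.x"  (current dev label maps to the 7.x branch)
console.log(resolveTargetBranch('v7.8.1')); // "7.8"  (captured major.minor)
```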
Binary file added docs/apm/images/apm-service-map-anomaly.png
Binary file added docs/apm/images/green-service.png
Binary file added docs/apm/images/red-service.png
Binary file modified docs/apm/images/service-maps.png
Binary file added docs/apm/images/yellow-service.png
19 changes: 14 additions & 5 deletions docs/apm/machine-learning.asciidoc
@@ -6,22 +6,31 @@
<titleabbrev>Integrate with machine learning</titleabbrev>
++++

The Machine Learning integration will initiate a new job predefined to calculate anomaly scores on transaction response times.
The response time graph will show the expected bounds and add an annotation when the anomaly score is 75 or above.
Jobs can be created per transaction type, and based on the average response time.
Manage jobs in the *Machine Learning jobs management*.
The Machine Learning integration initiates a new job predefined to calculate anomaly scores on APM transaction durations.
Jobs can be created per transaction type, and are based on the service's average response time.

After a machine learning job is created, results are shown in two places:

The transaction duration graph will show the expected bounds and add an annotation when the anomaly score is 75 or above.

[role="screenshot"]
image::apm/images/apm-ml-integration.png[Example view of anomaly scores on response times in the APM app]

Service maps will display a color-coded anomaly indicator based on the detected anomaly score.

[role="screenshot"]
image::apm/images/apm-ml-integration.png[Example view of anomaly scores on response times in APM app in Kibana]
image::apm/images/apm-service-map-anomaly.png[Example view of anomaly scores on service maps in the APM app]

[float]
[[create-ml-integration]]
=== Create a new machine learning job

To enable machine learning anomaly detection, first choose a service to monitor.
Then, select **Integrations** > **Enable ML anomaly detection** and click **Create job**.

That's it! After a few minutes, the job will begin calculating results;
it might take additional time for results to appear on your graph.
Jobs can be managed in *Machine Learning jobs management*.

APM-specific anomaly detection wizards are also available for certain agents.
See the machine learning {ml-docs}/ootb-ml-jobs-apm.html[APM anomaly detection configurations] for more information.
24 changes: 23 additions & 1 deletion docs/apm/service-maps.asciidoc
@@ -9,7 +9,9 @@ Please use Chrome or Firefox if available.

A service map is a real-time visual representation of the instrumented services in your application's architecture.
It shows you how these services are connected, along with high-level metrics like average transaction duration,
requests per minute, and errors per minute, that allow you to quickly assess the status of your services.
requests per minute, and errors per minute.
If enabled, service maps also integrate with machine learning, providing real-time health indicators based on anomaly detection scores.
All of these features can help you to quickly and visually assess the status and health of your services.

We currently surface two types of service maps:

@@ -52,6 +54,26 @@ Additional filters are not currently available for service maps.
[role="screenshot"]
image::apm/images/service-maps-java.png[Example view of service maps with Java highlighted in the APM app in Kibana]

[float]
[[service-map-anomaly-detection]]
=== Anomaly detection with machine learning

Machine learning jobs can be created to calculate anomaly scores on APM transaction durations within the selected service.
When these jobs are active, service maps will display a color-coded anomaly indicator based on the detected anomaly score:

[horizontal]
image:apm/images/green-service.png[APM green service]:: Max anomaly score **<=25**. Service is healthy.
image:apm/images/yellow-service.png[APM yellow service]:: Max anomaly score **26-74**. Anomalous activity detected. Service may be degraded.
image:apm/images/red-service.png[APM red service]:: Max anomaly score **>=75**. Anomalous activity detected. Service is unhealthy.

[role="screenshot"]
image::apm/images/apm-service-map-anomaly.png[Example view of anomaly scores on service maps in the APM app]
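
The thresholds above amount to a simple mapping from a service's maximum anomaly score to an indicator color. The following sketch is illustrative only, not the Kibana implementation:

```typescript
// Illustrative only, not Kibana source: map a service's max anomaly score
// to the color-coded indicator described above.
type ServiceIndicator = 'green' | 'yellow' | 'red';

function indicatorForAnomalyScore(maxAnomalyScore: number): ServiceIndicator {
  if (maxAnomalyScore <= 25) return 'green'; // healthy
  if (maxAnomalyScore < 75) return 'yellow'; // anomalous activity; may be degraded
  return 'red'; // anomalous activity; unhealthy
}
```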

If an anomaly has been detected, click *view anomalies* to open the anomaly detection metric viewer in the Machine Learning app.
This time series analysis displays additional details on the severity and time of the detected anomalies.

To learn how to create a machine learning job, see <<machine-learning-integration,machine learning integration>>.

[float]
[[service-maps-legend]]
=== Legend
@@ -0,0 +1,11 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [kibana-plugin-core-server](./kibana-plugin-core-server.md) &gt; [SavedObjectsMigrationLogger](./kibana-plugin-core-server.savedobjectsmigrationlogger.md) &gt; [error](./kibana-plugin-core-server.savedobjectsmigrationlogger.error.md)

## SavedObjectsMigrationLogger.error property

<b>Signature:</b>

```typescript
error: (msg: string, meta: LogMeta) => void;
```
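
A minimal usage sketch; the helper function and import path are assumptions for illustration, not part of the generated API docs. The shape of the `meta` argument mirrors the `{ rawDocument }` usage in the migration code in this commit:

```typescript
import { SavedObjectsMigrationLogger } from 'src/core/server';

// Hypothetical helper showing the (msg, meta) call shape; the meta object is
// attached to the log entry as structured data.
function reportCorruptDoc(log: SavedObjectsMigrationLogger, rawDoc: { _id: string }) {
  log.error(`Unable to migrate the corrupt saved object document ${rawDoc._id}.`, {
    rawDocument: rawDoc,
  });
}
```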
@@ -16,6 +16,7 @@ export interface SavedObjectsMigrationLogger
| Property | Type | Description |
| --- | --- | --- |
| [debug](./kibana-plugin-core-server.savedobjectsmigrationlogger.debug.md) | <code>(msg: string) =&gt; void</code> | |
| [error](./kibana-plugin-core-server.savedobjectsmigrationlogger.error.md) | <code>(msg: string, meta: LogMeta) =&gt; void</code> | |
| [info](./kibana-plugin-core-server.savedobjectsmigrationlogger.info.md) | <code>(msg: string) =&gt; void</code> | |
| [warn](./kibana-plugin-core-server.savedobjectsmigrationlogger.warn.md) | <code>(msg: string) =&gt; void</code> | |
| [warning](./kibana-plugin-core-server.savedobjectsmigrationlogger.warning.md) | <code>(msg: string) =&gt; void</code> | |
@@ -0,0 +1,15 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [kibana-plugin-plugins-data-server](./kibana-plugin-plugins-data-server.md) &gt; [IIndexPattern](./kibana-plugin-plugins-data-server.iindexpattern.md) &gt; [getTimeField](./kibana-plugin-plugins-data-server.iindexpattern.gettimefield.md)

## IIndexPattern.getTimeField() method

<b>Signature:</b>

```typescript
getTimeField?(): IFieldType | undefined;
```
<b>Returns:</b>
`IFieldType | undefined`
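
A hedged usage sketch; the import path and helper function are assumptions for illustration:

```typescript
import { IIndexPattern, IFieldType } from 'src/plugins/data/server';

// Hypothetical helper: getTimeField is optional on the interface, so guard
// the call; it may also legitimately return undefined.
function getTimeFieldName(indexPattern: IIndexPattern): string | undefined {
  const timeField: IFieldType | undefined = indexPattern.getTimeField?.();
  return timeField ? timeField.name : undefined;
}
```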
14 changes: 14 additions & 0 deletions docs/visualize/tsvb.asciidoc
@@ -122,3 +122,17 @@ Edit the source for the Markdown visualization.
. To insert the mustache template variable into the editor, click the variable name.
+
The http://mustache.github.io/mustache.5.html[mustache syntax] uses the Handlebars.js processor, which is an extended version of the Mustache template language.

[float]
[[tsvb-style-markdown]]
==== Style Markdown text

Style your Markdown visualization using http://lesscss.org/features/[less syntax].

. Select *Markdown*.

. Select *Panel options*.

. Enter styling rules in the *Custom CSS* section.
+
Less in TSVB does not support custom plugins or inline JavaScript.
2 changes: 1 addition & 1 deletion package.json
@@ -210,7 +210,7 @@
"leaflet-responsive-popup": "0.6.4",
"leaflet-vega": "^0.8.6",
"leaflet.heat": "0.2.0",
"less": "^2.7.3",
"less": "npm:@elastic/less@2.7.3-kibana",
"less-loader": "5.0.0",
"lodash": "npm:@elastic/lodash@3.10.1-kibana4",
"lodash.clonedeep": "^4.5.0",
6 changes: 3 additions & 3 deletions packages/kbn-optimizer/src/worker/webpack.config.ts
@@ -137,9 +137,9 @@ export function getWebpackConfig(bundle: Bundle, worker: WorkerConfig) {
// or which have require() statements that should be ignored because the file is
// already bundled with all its necessary dependencies
noParse: [
/[\///]node_modules[\///]elasticsearch-browser[\///]/,
/[\///]node_modules[\///]lodash[\///]index\.js$/,
/[\///]node_modules[\///]vega-lib[\///]build[\///]vega\.js$/,
/[\/\\]node_modules[\/\\]elasticsearch-browser[\/\\]/,
/[\/\\]node_modules[\/\\]lodash[\/\\]index\.js$/,
/[\/\\]node_modules[\/\\]vega-lib[\/\\]build[\/\\]vega\.js$/,
],

rules: [
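
The fix here replaces the character class `[\///]`, which deduplicates to just `/`, with `[\/\\]`, which matches both POSIX and Windows path separators. A small illustrative sketch:

```typescript
// The old class [\///] contains only "/" (listed three times), so it never
// matched Windows paths; [\/\\] matches "/" or "\".
const posixOnly = /[\///]node_modules[\///]lodash[\///]index\.js$/;
const crossPlatform = /[\/\\]node_modules[\/\\]lodash[\/\\]index\.js$/;

console.log(posixOnly.test('C:\\repo\\node_modules\\lodash\\index.js')); // false
console.log(crossPlatform.test('C:\\repo\\node_modules\\lodash\\index.js')); // true
console.log(crossPlatform.test('/repo/node_modules/lodash/index.js')); // true
```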
26 changes: 13 additions & 13 deletions src/core/server/logging/README.md
@@ -167,7 +167,7 @@ logging:
- context: plugins
appenders: [custom]
level: warn
- context: plugins.pid
- context: plugins.myPlugin
level: info
- context: server
level: fatal
@@ -180,14 +180,14 @@
Here is what we get with the config above:
| Context | Appenders | Level |
| ------------- |:------------------------:| -----:|
| root | console, file | error |
| plugins | custom | warn |
| plugins.pid | custom | info |
| server | console, file | fatal |
| optimize | console | error |
| telemetry | json-file-appender | all |
| Context | Appenders | Level |
| ---------------- |:------------------------:| -----:|
| root | console, file | error |
| plugins | custom | warn |
| plugins.myPlugin | custom | info |
| server | console, file | fatal |
| optimize | console | error |
| telemetry | json-file-appender | all |
The `root` logger has a dedicated configuration node since this context is special and should always exist. By
@@ -259,7 +259,7 @@ define a custom one.
```yaml
logging:
loggers:
- context: your-plugin
- context: plugins.myPlugin
appenders: [console]
```
Logs to a *file* if a file path is given. You should define a custom appender with `kind: file`
@@ -273,7 +273,7 @@ logging:
layout:
kind: pattern
loggers:
- context: your-plugin
- context: plugins.myPlugin
appenders: [file]
```
#### logging.json
@@ -282,10 +282,10 @@ the output format with [layouts](#layouts).

#### logging.quiet
Suppresses all logging output other than error messages. With new logging, config can be achieved
with adjusting minimum required [logging level](#log-level)
with adjusting minimum required [logging level](#log-level).
```yaml
loggers:
- context: my-plugin
- context: plugins.myPlugin
appenders: [console]
level: error
# or for all output
@@ -293,7 +293,7 @@ describe('DocumentMigrator', () => {
migrationVersion: { dog: '10.2.0' },
})
).toThrow(
/Document "smelly" has property "dog" which belongs to a more recent version of Kibana \(10\.2\.0\)/i
/Document "smelly" has property "dog" which belongs to a more recent version of Kibana \[10\.2\.0\]\. The last known version is \[undefined\]/i
);
});

@@ -315,7 +315,7 @@ describe('DocumentMigrator', () => {
migrationVersion: { dawg: '1.2.4' },
})
).toThrow(
/Document "fleabag" has property "dawg" which belongs to a more recent version of Kibana \(1\.2\.4\)/i
/Document "fleabag" has property "dawg" which belongs to a more recent version of Kibana \[1\.2\.4\]\. The last known version is \[1\.2\.3\]/i
);
});

@@ -350,7 +350,7 @@ function nextUnmigratedProp(doc: SavedObjectUnsanitizedDoc, migrations: ActiveMi
if (docVersion && (!latestVersion || Semver.gt(docVersion, latestVersion))) {
throw Boom.badData(
`Document "${doc.id}" has property "${p}" which belongs to a more recent` +
` version of Kibana (${docVersion}).`,
` version of Kibana [${docVersion}]. The last known version is [${latestVersion}]`,
doc
);
}
@@ -195,7 +195,7 @@ async function migrateSourceToDest(context: Context) {
await Index.write(
callCluster,
dest.indexName,
migrateRawDocs(serializer, documentMigrator.migrate, docs)
migrateRawDocs(serializer, documentMigrator.migrate, docs, log)
);
}
}
@@ -21,6 +21,7 @@ import _ from 'lodash';
import { SavedObjectTypeRegistry } from '../../saved_objects_type_registry';
import { SavedObjectsSerializer } from '../../serialization';
import { migrateRawDocs } from './migrate_raw_docs';
import { createSavedObjectsMigrationLoggerMock } from '../../migrations/mocks';

describe('migrateRawDocs', () => {
test('converts raw docs to saved objects', async () => {
@@ -31,7 +32,8 @@ describe('migrateRawDocs', () => {
[
{ _id: 'a:b', _source: { type: 'a', a: { name: 'AAA' } } },
{ _id: 'c:d', _source: { type: 'c', c: { name: 'DDD' } } },
]
],
createSavedObjectsMigrationLoggerMock()
);

expect(result).toEqual([
@@ -48,7 +50,8 @@ describe('migrateRawDocs', () => {
expect(transform).toHaveBeenCalled();
});

test('passes invalid docs through untouched', async () => {
test('passes invalid docs through untouched and logs error', async () => {
const logger = createSavedObjectsMigrationLoggerMock();
const transform = jest.fn<any, any>((doc: any) =>
_.set(_.cloneDeep(doc), 'attributes.name', 'TADA')
);
@@ -58,7 +61,8 @@
[
{ _id: 'foo:b', _source: { type: 'a', a: { name: 'AAA' } } },
{ _id: 'c:d', _source: { type: 'c', c: { name: 'DDD' } } },
]
],
logger
);

expect(result).toEqual([
@@ -82,5 +86,7 @@
},
],
]);

expect(logger.error).toBeCalledTimes(1);
});
});
@@ -23,6 +23,7 @@

import { SavedObjectsRawDoc, SavedObjectsSerializer } from '../../serialization';
import { TransformFn } from './document_migrator';
import { SavedObjectsMigrationLogger } from '.';

/**
* Applies the specified migration function to every saved object document in the list
@@ -35,7 +36,8 @@ import { TransformFn } from './document_migrator';
export function migrateRawDocs(
serializer: SavedObjectsSerializer,
migrateDoc: TransformFn,
rawDocs: SavedObjectsRawDoc[]
rawDocs: SavedObjectsRawDoc[],
log: SavedObjectsMigrationLogger
): SavedObjectsRawDoc[] {
return rawDocs.map(raw => {
if (serializer.isRawSavedObject(raw)) {
Expand All @@ -47,6 +49,10 @@ export function migrateRawDocs(
});
}

log.error(
`Error: Unable to migrate the corrupt Saved Object document ${raw._id}. To prevent Kibana from performing a migration on every restart, please delete or fix this document by ensuring that the namespace and type in the document's id matches the values in the namespace and type fields.`,
{ rawDocument: raw }
);
return raw;
});
}
@@ -19,14 +19,10 @@

import _ from 'lodash';
import { coordinateMigration } from './migration_coordinator';
import { createSavedObjectsMigrationLoggerMock } from '../mocks';

describe('coordinateMigration', () => {
const log = {
debug: jest.fn(),
warning: jest.fn(),
warn: jest.fn(),
info: jest.fn(),
};
const log = createSavedObjectsMigrationLoggerMock();

test('waits for isMigrated, if there is an index conflict', async () => {
const pollInterval = 1;
@@ -17,7 +17,7 @@
* under the License.
*/

import { Logger } from 'src/core/server/logging';
import { Logger, LogMeta } from '../../../logging';

/*
* This file provides a helper class for ensuring that all logging
@@ -35,6 +35,7 @@ export interface SavedObjectsMigrationLogger {
*/
warning: (msg: string) => void;
warn: (msg: string) => void;
error: (msg: string, meta: LogMeta) => void;
}

export class MigrationLogger implements SavedObjectsMigrationLogger {
@@ -48,4 +49,5 @@ export class MigrationLogger implements SavedObjectsMigrationLogger {
public debug = (msg: string) => this.logger.debug(msg);
public warning = (msg: string) => this.logger.warn(msg);
public warn = (msg: string) => this.logger.warn(msg);
public error = (msg: string, meta: LogMeta) => this.logger.error(msg, meta);
}
@@ -22,9 +22,9 @@
* (the shape of the mappings and documents in the index).
*/

import { Logger } from 'src/core/server/logging';
import { KibanaConfigType } from 'src/core/server/kibana_config';
import { BehaviorSubject } from 'rxjs';
import { Logger } from '../../../logging';
import { IndexMapping, SavedObjectsTypeMappingDefinitions } from '../../mappings';
import { SavedObjectUnsanitizedDoc, SavedObjectsSerializer } from '../../serialization';
import { docValidator, PropertyValidators } from '../../validation';
