[Backport 2.x] JSON Catalog Reader for Integrations (opensearch-project#1392)

* Add full statics scan method

* Add sample data to serialization method

* Finish integration serialization

* Rename integration reader and add stub json adaptor

* Add JsonCatalogDataAdaptor directory type checking

* Add integration version filtering to json_data_adaptor

* Remove unviable serialization implementation

The existing serialization implementation could not recover all data: the mapping between file names and file content was lost when the content was appended to the end of the JSON object.
To work around this, a new serialization method needs to preserve that association.
The most viable solution I see is to embed the encoded file data right next to the file names within the config. This makes the config slightly harder to read if word wrap is enabled, but otherwise solves the issue.

Going to clear this method out entirely for now and restore old functionality with TDD.

Alternative solution: We could make a loose association by making heavy assumptions about array ordering and adding complicated parsing logic, but I'm not sure it's worth it.
Better to go back to the drawing board and write a serialization method that's more flexible.

* Add new static serialization

* Add component and sample data serialization

* Fix serialization test for new serialization approach

* Add new asset serialization

* Fix broken compareVersions comparison and version sorting

When I noticed that adaptors had to handle version sorting, I started writing a test to verify that versions were being sorted.
While verifying the test, I noticed a strange result: 2.0.0 was sorted after both 1.0.0 and 2.1.0.
After moving the comparison function over to integration_reader, where it belongs, and writing tests,
I saw the method was falsely reporting that 2.0.0 and 2.1.0 were equal.

Debugging the comparison, I found that the version '1.2.3' was being parsed as [1, NaN, NaN].
After several more minutes of banging my head against my desk, I realized the culprit was this line:
`a.split('.').map(Number.parseInt)`.
Because parseInt takes two arguments (a string and a radix), map was passing it each part *and its index*, so the index was interpreted as the radix.

The error went away after changing the callback to pass only the part.

* Add ability to read config file from json integration

* Add rudimentary integration search to JSONAdaptor

* Create readRawConfig method to avoid data pruning

* Add localized config parsing to getAssets

* Convert getSampleData to use localized config

* Move localized reader logic to helper method

* Convert getSchemas to use new data retrieval

* Add localized config handling for getStatic

* Add basic validation test for JSON reader

* Add static validation to integration reader

* Add more tests for deepCheck on JSON catalog

* Add tests for json adaptor edge cases

* Add reserialization tests

* Add tests and remove redundant error checks for 90% coverage

* Fix lints in repository.test.ts

* Fix lints for integrations_builder.ts

* Update snapshots

* Revert "Update snapshots"

This reverts commit 62b238d.

* Use semver library for comparison

* Switch to upstream NDJson parser

* Add inline documentation for IntegrationPart

* Move deepCheck to utils

* Add further utility methods to utils

* Refactor repository tests to not rely on result.ok == false

---------

(cherry picked from commit 6bd872c)

Signed-off-by: Simeon Widdis <sawiddis@amazon.com>
Signed-off-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
(cherry picked from commit 216aa92)
opensearch-trigger-bot[bot] authored and A9 Swift Project User committed Feb 1, 2024
1 parent 6e2989f commit d00fd4f
Showing 19 changed files with 1,350 additions and 430 deletions.
4 changes: 2 additions & 2 deletions auto_sync_commit_metadata.json
@@ -1,4 +1,4 @@
{
"last_github_commit": "305e0370bfb0855458bcce84359141284bdf777e",
"last_gitfarm_commit": "065be22ca9fa8f7c20be437de3751e3b3ac9dc0f"
"last_github_commit": "216aa92cd3c78aaa6f0ed5e576cb4d093a3ba4c2",
"last_gitfarm_commit": "91c12f84111af8dd26702a7eb382e538e69caef5"
}
34 changes: 25 additions & 9 deletions server/adaptors/integrations/__test__/builder.test.ts
@@ -5,7 +5,13 @@

import { SavedObjectsClientContract } from '../../../../../../src/core/server';
import { IntegrationInstanceBuilder } from '../integrations_builder';
import { IntegrationReader } from '../repository/integration';
import { IntegrationReader } from '../repository/integration_reader';
import * as mockUtils from '../repository/utils';

jest.mock('../repository/utils', () => ({
...jest.requireActual('../repository/utils'),
deepCheck: jest.fn(),
}));

const mockSavedObjectsClient: SavedObjectsClientContract = ({
bulkCreate: jest.fn(),
@@ -17,7 +23,6 @@ const mockSavedObjectsClient: SavedObjectsClientContract = ({
} as unknown) as SavedObjectsClientContract;

const sampleIntegration: IntegrationReader = ({
deepCheck: jest.fn().mockResolvedValue(true),
getAssets: jest.fn().mockResolvedValue({
savedObjects: [
{
@@ -104,8 +109,12 @@ describe('IntegrationInstanceBuilder', () => {
},
};

jest
.spyOn(mockUtils, 'deepCheck')
.mockResolvedValue({ ok: true, value: mockTemplate as IntegrationConfig });

// Mock the implementation of the methods in the Integration class
sampleIntegration.deepCheck = jest.fn().mockResolvedValue({ ok: true, value: mockTemplate });
// sampleIntegration.deepCheck = jest.fn().mockResolvedValue({ ok: true, value: mockTemplate });
sampleIntegration.getAssets = jest
.fn()
.mockResolvedValue({ ok: true, value: { savedObjects: remappedAssets } });
@@ -119,7 +128,6 @@ describe('IntegrationInstanceBuilder', () => {

const instance = await builder.build(sampleIntegration, options);

expect(sampleIntegration.deepCheck).toHaveBeenCalled();
expect(sampleIntegration.getAssets).toHaveBeenCalled();
expect(remapIDsSpy).toHaveBeenCalledWith(remappedAssets);
expect(postAssetsSpy).toHaveBeenCalledWith(remappedAssets);
@@ -131,8 +139,8 @@ describe('IntegrationInstanceBuilder', () => {
dataSource: 'instance-datasource',
name: 'instance-name',
};
sampleIntegration.deepCheck = jest
.fn()
jest
.spyOn(mockUtils, 'deepCheck')
.mockResolvedValue({ ok: false, error: new Error('Mock error') });

await expect(builder.build(sampleIntegration, options)).rejects.toThrowError('Mock error');
@@ -145,7 +153,9 @@ describe('IntegrationInstanceBuilder', () => {
};

const errorMessage = 'Failed to get assets';
sampleIntegration.deepCheck = jest.fn().mockResolvedValue({ ok: true, value: {} });
jest
.spyOn(mockUtils, 'deepCheck')
.mockResolvedValue({ ok: true, value: ({} as unknown) as IntegrationConfig });
sampleIntegration.getAssets = jest
.fn()
.mockResolvedValue({ ok: false, error: new Error(errorMessage) });
@@ -165,7 +175,9 @@ describe('IntegrationInstanceBuilder', () => {
},
];
const errorMessage = 'Failed to post assets';
sampleIntegration.deepCheck = jest.fn().mockResolvedValue({ ok: true, value: {} });
jest
.spyOn(mockUtils, 'deepCheck')
.mockResolvedValue({ ok: true, value: ({} as unknown) as IntegrationConfig });
sampleIntegration.getAssets = jest
.fn()
.mockResolvedValue({ ok: true, value: { savedObjects: remappedAssets } });
@@ -180,10 +192,14 @@ describe('IntegrationInstanceBuilder', () => {
const assets = [
{
id: 'asset1',
type: 'unknown',
attributes: { title: 'asset1' },
references: [{ id: 'ref1' }, { id: 'ref2' }],
},
{
id: 'asset2',
type: 'unknown',
attributes: { title: 'asset1' },
references: [{ id: 'ref1' }, { id: 'ref3' }],
},
];
@@ -200,7 +216,7 @@

const remappedAssets = builder.remapIDs(assets);

expect(remappedAssets).toEqual(expectedRemappedAssets);
expect(remappedAssets).toMatchObject(expectedRemappedAssets);
});
});

225 changes: 225 additions & 0 deletions server/adaptors/integrations/__test__/json_repository.test.ts
@@ -0,0 +1,225 @@
/*
* Copyright OpenSearch Contributors
* SPDX-License-Identifier: Apache-2.0
*/

/**
* Serialization tests for integrations in the local repository.
*/

import { TemplateManager } from '../repository/repository';
import { IntegrationReader } from '../repository/integration_reader';
import path from 'path';
import * as fs from 'fs/promises';
import { JsonCatalogDataAdaptor } from '../repository/json_data_adaptor';
import { deepCheck, foldResults } from '../repository/utils';

const fetchSerializedIntegrations = async (): Promise<Result<SerializedIntegration[], Error>> => {
const directory = path.join(__dirname, '../__data__/repository');
const folders = await fs.readdir(directory);
const readers = await Promise.all(
folders.map(async (folder) => {
const integPath = path.join(directory, folder);
if (!(await fs.lstat(integPath)).isDirectory()) {
// If it's not a directory (e.g. a README), skip it
return Promise.resolve(null);
}
// Otherwise, all directories must be integrations
return new IntegrationReader(integPath);
})
);
const serializedIntegrationResults = await Promise.all(
(readers.filter((x) => x !== null) as IntegrationReader[]).map((r) => r.serialize())
);
return foldResults(serializedIntegrationResults);
};

describe('The Local Serialized Catalog', () => {
it('Should serialize without errors', async () => {
const serialized = await fetchSerializedIntegrations();
expect(serialized.ok).toBe(true);
});

it('Should pass deep validation for all serialized integrations', async () => {
const serialized = await fetchSerializedIntegrations();
const repository = new TemplateManager(
'.',
new JsonCatalogDataAdaptor(serialized.value as SerializedIntegration[])
);

for (const integ of await repository.getIntegrationList()) {
const validationResult = await deepCheck(integ);
await expect(validationResult).toHaveProperty('ok', true);
}
});

it('Should correctly retrieve a logo', async () => {
const serialized = await fetchSerializedIntegrations();
const repository = new TemplateManager(
'.',
new JsonCatalogDataAdaptor(serialized.value as SerializedIntegration[])
);
const integration = (await repository.getIntegration('nginx')) as IntegrationReader;
const logoStatic = await integration.getStatic('logo.svg');

expect(logoStatic).toHaveProperty('ok', true);
expect((logoStatic.value as Buffer).length).toBeGreaterThan(1000);
});

it('Should correctly retrieve a gallery image', async () => {
const serialized = await fetchSerializedIntegrations();
const repository = new TemplateManager(
'.',
new JsonCatalogDataAdaptor(serialized.value as SerializedIntegration[])
);
const integration = (await repository.getIntegration('nginx')) as IntegrationReader;
const logoStatic = await integration.getStatic('dashboard1.png');

expect(logoStatic).toHaveProperty('ok', true);
expect((logoStatic.value as Buffer).length).toBeGreaterThan(1000);
});

it('Should correctly retrieve a dark mode logo', async () => {
const TEST_INTEGRATION = 'nginx';
const serialized = await fetchSerializedIntegrations();
const config = (serialized.value as SerializedIntegration[]).filter(
(integ: { name: string; components: unknown[] }) => integ.name === TEST_INTEGRATION
)[0];

if (!config.statics) {
throw new Error('NginX integration missing statics (invalid test)');
}
config.statics.darkModeGallery = config.statics.gallery;
config.statics.darkModeLogo = {
...(config.statics.logo as SerializedStaticAsset),
path: 'dark_logo.svg',
};

const reader = new IntegrationReader('nginx', new JsonCatalogDataAdaptor([config]));

await expect(reader.getStatic('dark_logo.svg')).resolves.toHaveProperty('ok', true);
});

it('Should correctly re-serialize', async () => {
const TEST_INTEGRATION = 'nginx';
const serialized = await fetchSerializedIntegrations();
const config = (serialized.value as SerializedIntegration[]).filter(
(integ: { name: string }) => integ.name === TEST_INTEGRATION
)[0];

const reader = new IntegrationReader('nginx', new JsonCatalogDataAdaptor([config]));
const reserialized = await reader.serialize();

expect(reserialized.value).toEqual(config);
});

it('Should correctly re-serialize with dark mode values', async () => {
const TEST_INTEGRATION = 'nginx';
const serialized = await fetchSerializedIntegrations();
const config = (serialized.value as SerializedIntegration[]).filter(
(integ: { name: string }) => integ.name === TEST_INTEGRATION
)[0];

if (!config.statics) {
throw new Error('NginX integration missing statics (invalid test)');
}
config.statics.darkModeGallery = config.statics.gallery;
config.statics.darkModeLogo = {
...(config.statics.logo as SerializedStaticAsset),
path: 'dark_logo.svg',
};

const reader = new IntegrationReader('nginx', new JsonCatalogDataAdaptor([config]));
const reserialized = await reader.serialize();

expect(reserialized.value).toEqual(config);
});
});

describe('Integration validation', () => {
it('Should correctly fail an integration without schemas', async () => {
const TEST_INTEGRATION = 'nginx';
const serialized = await fetchSerializedIntegrations();
const transformedSerialized = (serialized.value as SerializedIntegration[])
.filter((integ: { name: string; components: unknown[] }) => integ.name === TEST_INTEGRATION)
.map((integ) => {
return {
...integ,
components: [] as SerializedIntegrationComponent[],
};
});
const integration = new IntegrationReader(
TEST_INTEGRATION,
new JsonCatalogDataAdaptor(transformedSerialized)
);

await expect(deepCheck(integration)).resolves.toHaveProperty('ok', false);
});

it('Should correctly fail an integration without assets', async () => {
const TEST_INTEGRATION = 'nginx';
const serialized = await fetchSerializedIntegrations();
const transformedSerialized = (serialized.value as SerializedIntegration[])
.filter((integ: { name: string; components: unknown[] }) => integ.name === TEST_INTEGRATION)
.map((integ) => {
return {
...integ,
assets: {} as SerializedIntegrationAssets,
};
});
const integration = new IntegrationReader(
TEST_INTEGRATION,
new JsonCatalogDataAdaptor(transformedSerialized)
);

await expect(deepCheck(integration)).resolves.toHaveProperty('ok', false);
});
});

describe('JSON Catalog with invalid data', () => {
it('Should report an error if images are missing data', async () => {
const TEST_INTEGRATION = 'nginx';
const serialized = await fetchSerializedIntegrations();
const baseConfig = (serialized.value as SerializedIntegration[]).filter(
(integ: { name: string; components: unknown[] }) => integ.name === TEST_INTEGRATION
)[0];

if (!baseConfig.statics) {
throw new Error('NginX integration missing statics (invalid test)');
}

baseConfig.statics = {
logo: { path: 'logo.svg' } as SerializedStaticAsset,
darkModeLogo: { path: 'dm_logo.svg' } as SerializedStaticAsset,
gallery: [{ path: '1.png' }] as SerializedStaticAsset[],
darkModeGallery: [{ path: 'dm_1.png' }] as SerializedStaticAsset[],
};
const reader = new IntegrationReader(
TEST_INTEGRATION,
new JsonCatalogDataAdaptor([baseConfig])
);

await expect(reader.getStatic('logo.svg')).resolves.toHaveProperty('ok', false);
await expect(reader.getStatic('dm_logo.svg')).resolves.toHaveProperty('ok', false);
await expect(reader.getStatic('1.png')).resolves.toHaveProperty('ok', false);
await expect(reader.getStatic('dm_1.png')).resolves.toHaveProperty('ok', false);
});

it('Should report an error on read if a schema has invalid JSON', async () => {
const TEST_INTEGRATION = 'nginx';
const serialized = await fetchSerializedIntegrations();
const baseConfig = (serialized.value as SerializedIntegration[]).filter(
(integ: { name: string; components: unknown[] }) => integ.name === TEST_INTEGRATION
)[0];

expect(baseConfig.components.length).toBeGreaterThanOrEqual(2);
baseConfig.components[1].data = '{"invalid_json": true';

const reader = new IntegrationReader(
TEST_INTEGRATION,
new JsonCatalogDataAdaptor([baseConfig])
);

await expect(reader.getSchemas()).resolves.toHaveProperty('ok', false);
});
});
@@ -3,10 +3,15 @@
* SPDX-License-Identifier: Apache-2.0
*/

/**
* This file is used as integration tests for Integrations Repository functionality.
*/

import { TemplateManager } from '../repository/repository';
import { IntegrationReader } from '../repository/integration';
import { IntegrationReader } from '../repository/integration_reader';
import path from 'path';
import * as fs from 'fs/promises';
import { deepCheck } from '../repository/utils';

describe('The local repository', () => {
it('Should only contain valid integration directories or files.', async () => {
@@ -21,7 +26,7 @@ describe('The local repository', () => {
}
// Otherwise, all directories must be integrations
const integ = new IntegrationReader(integPath);
expect(integ.getConfig()).resolves.toHaveProperty('ok', true);
await expect(integ.getConfig()).resolves.toHaveProperty('ok', true);
})
);
});
@@ -33,7 +38,7 @@ describe('The local repository', () => {
const integrations: IntegrationReader[] = await repository.getIntegrationList();
await Promise.all(
integrations.map(async (i) => {
const result = await i.deepCheck();
const result = await deepCheck(i);
if (!result.ok) {
console.error(result.error);
}
@@ -42,3 +47,28 @@
);
});
});

describe('Local Nginx Integration', () => {
it('Should serialize without errors', async () => {
const repository: TemplateManager = new TemplateManager(
path.join(__dirname, '../__data__/repository')
);
const integration = await repository.getIntegration('nginx');

await expect(integration?.serialize()).resolves.toHaveProperty('ok', true);
});

it('Should serialize to include the config', async () => {
const repository: TemplateManager = new TemplateManager(
path.join(__dirname, '../__data__/repository')
);
const integration = await repository.getIntegration('nginx');
const config = await integration!.getConfig();
const serialized = await integration!.serialize();

expect(serialized).toHaveProperty('ok', true);
expect((serialized as { value: object }).value).toMatchObject(
(config as { value: object }).value
);
});
});