forked from opensearch-project/dashboards-observability
[Backport 2.x] JSON Catalog Reader for Integrations (opensearch-project#1392)

* Add full statics scan method
* Add sample data to serialization method
* Finish integration serialization
* Rename integration reader and add stub json adaptor
* Add JsonCatalogDataAdaptor directory type checking
* Add integration version filtering to json_data_adaptor
* Remove unviable serialization implementation

  The existing serialization implementation was not able to recover all data: the mapping associating file names with their content was lost when the file content was appended at the end of the JSON object. To work around this, a new serialization method needs to preserve that association. The most viable solution I see is to add encoded file data right next to the file names within the config, which will make it slightly harder to read if word wrap is enabled, but otherwise solves the issue. Going to clear this method out entirely for now and restore old functionality with TDD.

  Alternative solution: we could make a loose association by making heavy assumptions about array ordering, with complicated parsing logic, but I'm not sure it's worth it. Better to go back to the drawing board and write a serialization method that's more flexible.

* Add new static serialization
* Add component and sample data serialization
* Fix serialization test for new serialization approach
* Add new asset serialization
* Fix broken compareVersions comparison and version sorting

  When I noticed that adaptors had to handle version sorting, I started writing a test to verify the versions were being sorted. While verifying the test, I noticed a weird result where 2.0.0 was being sorted after both 1.0.0 and 2.1.0. After moving the comparison function over to integration_reader where it should be and writing tests, I saw the method was falsely reporting 2.0.0 and 2.1.0 as equal. Debugging the comparison, I found that the version '1.2.3' was being parsed as [1, NaN, NaN]. Several more minutes of banging my head against my desk, and I realized the suspect was this line: `a.split('.').map(Number.parseInt)`. Because parseInt takes two arguments (a string and a radix), map was passing it the part *and the index*. The error went away after fixing the method to pass only the part.

* Add ability to read config file from json integration
* Add rudimentary integration search to JSONAdaptor
* Create readRawConfig method to avoid data pruning
* Add localized config parsing to getAssets
* Convert getSampleData to use localized config
* Move localized reader logic to helper method
* Convert getSchemas to use new data retrieval
* Add localized config handling for getStatic
* Add basic validation test for JSON reader
* Add static validation to integration reader
* Add more tests for deepCheck on JSON catalog
* Add tests for json adaptor edge cases
* Add reserialization tests
* Add tests and remove redundant error checks for 90% coverage
* Fix lints in repository.test.ts
* Fix lints for integrations_builder.ts
* Update snapshots
* Revert "Update snapshots"

  This reverts commit 62b238d.

* Use semver library for comparison
* Switch to upstream NDJson parser
* Add inline documentation for IntegrationPart
* Move deepCheck to utils
* Add further utility methods to utils
* Refactor repository tests to not rely on result.ok == false

---------

(cherry picked from commit 6bd872c)

Signed-off-by: Simeon Widdis <sawiddis@amazon.com>
Signed-off-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>

(cherry picked from commit 216aa92)
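The parseInt pitfall described in the commit message is easy to reproduce. A minimal sketch of the buggy and corrected forms (standalone illustration, not the project's actual `compareVersions` code):

```typescript
// Buggy form: Array.prototype.map passes (element, index, array) to its
// callback, and parseInt interprets the second argument as a radix.
const buggyParse = (v: string) => v.split('.').map(Number.parseInt);
// parseInt('1', 0) -> 1   (radix 0 falls back to base 10)
// parseInt('2', 1) -> NaN (radix 1 is invalid)
// parseInt('3', 2) -> NaN ('3' is not a base-2 digit)
console.log(buggyParse('1.2.3')); // [1, NaN, NaN]

// Fixed form: pass only the part, with an explicit radix.
const parseVersion = (v: string) => v.split('.').map((part) => Number.parseInt(part, 10));
console.log(parseVersion('1.2.3')); // [1, 2, 3]
```

The same hazard applies to any variadic callback handed directly to `map`; wrapping the call in an arrow function with a single named parameter avoids it.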
1 parent 6e2989f · commit d00fd4f · 19 changed files with 1,350 additions and 430 deletions.
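The inline-encoding approach described in the commit message (keeping encoded file data right next to the file names in the config) could look roughly like the sketch below. The interface shape and helper names are illustrative assumptions, not the repository's actual code:

```typescript
// Hypothetical shape: base64-encode each static file and store the data
// inline next to its path, so the name-to-content mapping survives a
// serialize/deserialize round trip.
interface InlineStaticAsset {
  path: string;
  data: string; // base64-encoded file content (assumed field name)
}

const serializeAsset = (filePath: string, content: Buffer): InlineStaticAsset => ({
  path: filePath,
  data: content.toString('base64'),
});

const deserializeAsset = (asset: InlineStaticAsset): Buffer =>
  Buffer.from(asset.data, 'base64');
```

The cost noted in the message follows directly: the base64 payload sits inside the config object, so the JSON is noisier to read, but nothing about the association depends on array ordering.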
@@ -1,4 +1,4 @@
 {
-  "last_github_commit": "305e0370bfb0855458bcce84359141284bdf777e",
-  "last_gitfarm_commit": "065be22ca9fa8f7c20be437de3751e3b3ac9dc0f"
+  "last_github_commit": "216aa92cd3c78aaa6f0ed5e576cb4d093a3ba4c2",
+  "last_gitfarm_commit": "91c12f84111af8dd26702a7eb382e538e69caef5"
 }
server/adaptors/integrations/__test__/json_repository.test.ts (225 additions, 0 deletions)
/*
 * Copyright OpenSearch Contributors
 * SPDX-License-Identifier: Apache-2.0
 */

/**
 * Serialization tests for integrations in the local repository.
 */

import { TemplateManager } from '../repository/repository';
import { IntegrationReader } from '../repository/integration_reader';
import path from 'path';
import * as fs from 'fs/promises';
import { JsonCatalogDataAdaptor } from '../repository/json_data_adaptor';
import { deepCheck, foldResults } from '../repository/utils';

const fetchSerializedIntegrations = async (): Promise<Result<SerializedIntegration[], Error>> => {
  const directory = path.join(__dirname, '../__data__/repository');
  const folders = await fs.readdir(directory);
  const readers = await Promise.all(
    folders.map(async (folder) => {
      const integPath = path.join(directory, folder);
      if (!(await fs.lstat(integPath)).isDirectory()) {
        // If it's not a directory (e.g. a README), skip it
        return Promise.resolve(null);
      }
      // Otherwise, all directories must be integrations
      return new IntegrationReader(integPath);
    })
  );
  const serializedIntegrationResults = await Promise.all(
    (readers.filter((x) => x !== null) as IntegrationReader[]).map((r) => r.serialize())
  );
  return foldResults(serializedIntegrationResults);
};

describe('The Local Serialized Catalog', () => {
  it('Should serialize without errors', async () => {
    const serialized = await fetchSerializedIntegrations();
    expect(serialized.ok).toBe(true);
  });

  it('Should pass deep validation for all serialized integrations', async () => {
    const serialized = await fetchSerializedIntegrations();
    const repository = new TemplateManager(
      '.',
      new JsonCatalogDataAdaptor(serialized.value as SerializedIntegration[])
    );

    for (const integ of await repository.getIntegrationList()) {
      const validationResult = await deepCheck(integ);
      await expect(validationResult).toHaveProperty('ok', true);
    }
  });

  it('Should correctly retrieve a logo', async () => {
    const serialized = await fetchSerializedIntegrations();
    const repository = new TemplateManager(
      '.',
      new JsonCatalogDataAdaptor(serialized.value as SerializedIntegration[])
    );
    const integration = (await repository.getIntegration('nginx')) as IntegrationReader;
    const logoStatic = await integration.getStatic('logo.svg');

    expect(logoStatic).toHaveProperty('ok', true);
    expect((logoStatic.value as Buffer).length).toBeGreaterThan(1000);
  });

  it('Should correctly retrieve a gallery image', async () => {
    const serialized = await fetchSerializedIntegrations();
    const repository = new TemplateManager(
      '.',
      new JsonCatalogDataAdaptor(serialized.value as SerializedIntegration[])
    );
    const integration = (await repository.getIntegration('nginx')) as IntegrationReader;
    const logoStatic = await integration.getStatic('dashboard1.png');

    expect(logoStatic).toHaveProperty('ok', true);
    expect((logoStatic.value as Buffer).length).toBeGreaterThan(1000);
  });

  it('Should correctly retrieve a dark mode logo', async () => {
    const TEST_INTEGRATION = 'nginx';
    const serialized = await fetchSerializedIntegrations();
    const config = (serialized.value as SerializedIntegration[]).filter(
      (integ: { name: string; components: unknown[] }) => integ.name === TEST_INTEGRATION
    )[0];

    if (!config.statics) {
      throw new Error('NginX integration missing statics (invalid test)');
    }
    config.statics.darkModeGallery = config.statics.gallery;
    config.statics.darkModeLogo = {
      ...(config.statics.logo as SerializedStaticAsset),
      path: 'dark_logo.svg',
    };

    const reader = new IntegrationReader('nginx', new JsonCatalogDataAdaptor([config]));

    await expect(reader.getStatic('dark_logo.svg')).resolves.toHaveProperty('ok', true);
  });

  it('Should correctly re-serialize', async () => {
    const TEST_INTEGRATION = 'nginx';
    const serialized = await fetchSerializedIntegrations();
    const config = (serialized.value as SerializedIntegration[]).filter(
      (integ: { name: string }) => integ.name === TEST_INTEGRATION
    )[0];

    const reader = new IntegrationReader('nginx', new JsonCatalogDataAdaptor([config]));
    const reserialized = await reader.serialize();

    expect(reserialized.value).toEqual(config);
  });

  it('Should correctly re-serialize with dark mode values', async () => {
    const TEST_INTEGRATION = 'nginx';
    const serialized = await fetchSerializedIntegrations();
    const config = (serialized.value as SerializedIntegration[]).filter(
      (integ: { name: string }) => integ.name === TEST_INTEGRATION
    )[0];

    if (!config.statics) {
      throw new Error('NginX integration missing statics (invalid test)');
    }
    config.statics.darkModeGallery = config.statics.gallery;
    config.statics.darkModeLogo = {
      ...(config.statics.logo as SerializedStaticAsset),
      path: 'dark_logo.svg',
    };

    const reader = new IntegrationReader('nginx', new JsonCatalogDataAdaptor([config]));
    const reserialized = await reader.serialize();

    expect(reserialized.value).toEqual(config);
  });
});

describe('Integration validation', () => {
  it('Should correctly fail an integration without schemas', async () => {
    const TEST_INTEGRATION = 'nginx';
    const serialized = await fetchSerializedIntegrations();
    const transformedSerialized = (serialized.value as SerializedIntegration[])
      .filter((integ: { name: string; components: unknown[] }) => integ.name === TEST_INTEGRATION)
      .map((integ) => {
        return {
          ...integ,
          components: [] as SerializedIntegrationComponent[],
        };
      });
    const integration = new IntegrationReader(
      TEST_INTEGRATION,
      new JsonCatalogDataAdaptor(transformedSerialized)
    );

    await expect(deepCheck(integration)).resolves.toHaveProperty('ok', false);
  });

  it('Should correctly fail an integration without assets', async () => {
    const TEST_INTEGRATION = 'nginx';
    const serialized = await fetchSerializedIntegrations();
    const transformedSerialized = (serialized.value as SerializedIntegration[])
      .filter((integ: { name: string; components: unknown[] }) => integ.name === TEST_INTEGRATION)
      .map((integ) => {
        return {
          ...integ,
          assets: {} as SerializedIntegrationAssets,
        };
      });
    const integration = new IntegrationReader(
      TEST_INTEGRATION,
      new JsonCatalogDataAdaptor(transformedSerialized)
    );

    await expect(deepCheck(integration)).resolves.toHaveProperty('ok', false);
  });
});

describe('JSON Catalog with invalid data', () => {
  it('Should report an error if images are missing data', async () => {
    const TEST_INTEGRATION = 'nginx';
    const serialized = await fetchSerializedIntegrations();
    const baseConfig = (serialized.value as SerializedIntegration[]).filter(
      (integ: { name: string; components: unknown[] }) => integ.name === TEST_INTEGRATION
    )[0];

    if (!baseConfig.statics) {
      throw new Error('NginX integration missing statics (invalid test)');
    }

    baseConfig.statics = {
      logo: { path: 'logo.svg' } as SerializedStaticAsset,
      darkModeLogo: { path: 'dm_logo.svg' } as SerializedStaticAsset,
      gallery: [{ path: '1.png' }] as SerializedStaticAsset[],
      darkModeGallery: [{ path: 'dm_1.png' }] as SerializedStaticAsset[],
    };
    const reader = new IntegrationReader(
      TEST_INTEGRATION,
      new JsonCatalogDataAdaptor([baseConfig])
    );

    await expect(reader.getStatic('logo.svg')).resolves.toHaveProperty('ok', false);
    await expect(reader.getStatic('dm_logo.svg')).resolves.toHaveProperty('ok', false);
    await expect(reader.getStatic('1.png')).resolves.toHaveProperty('ok', false);
    await expect(reader.getStatic('dm_1.png')).resolves.toHaveProperty('ok', false);
  });

  it('Should report an error on read if a schema has invalid JSON', async () => {
    const TEST_INTEGRATION = 'nginx';
    const serialized = await fetchSerializedIntegrations();
    const baseConfig = (serialized.value as SerializedIntegration[]).filter(
      (integ: { name: string; components: unknown[] }) => integ.name === TEST_INTEGRATION
    )[0];

    expect(baseConfig.components.length).toBeGreaterThanOrEqual(2);
    baseConfig.components[1].data = '{"invalid_json": true';

    const reader = new IntegrationReader(
      TEST_INTEGRATION,
      new JsonCatalogDataAdaptor([baseConfig])
    );

    await expect(reader.getSchemas()).resolves.toHaveProperty('ok', false);
  });
});