diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS index f3a89032552f7..7237a21700add 100644 --- a/.github/CODEOWNERS +++ b/.github/CODEOWNERS @@ -1 +1 @@ -* @reta @anasalkouz @andrross @reta @Bukhtawar @CEHENKLE @dblock @gbbafna @setiah @kartg @kotwanikunal @mch2 @nknize @owaiskazi19 @Rishikesh1159 @ryanbogan @saratvemulapalli @shwetathareja @dreamer-89 @tlfeng @VachaShah @xuezhou25 +* @reta @anasalkouz @andrross @reta @Bukhtawar @CEHENKLE @dblock @gbbafna @setiah @kartg @kotwanikunal @mch2 @nknize @owaiskazi19 @Rishikesh1159 @ryanbogan @saratvemulapalli @shwetathareja @dreamer-89 @tlfeng @VachaShah @dbwiddis diff --git a/CHANGELOG.md b/CHANGELOG.md index 14a74df481894..80e944260ca41 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -84,15 +84,19 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), - [Search Pipelines] Accept pipelines defined in search source ([#7253](https://github.com/opensearch-project/OpenSearch/pull/7253)) - [Search Pipelines] Add `default_search_pipeline` index setting ([#7470](https://github.com/opensearch-project/OpenSearch/pull/7470)) - [Search Pipelines] Add RenameFieldResponseProcessor for Search Pipelines ([#7377](https://github.com/opensearch-project/OpenSearch/pull/7377)) +- [Search Pipelines] Split search pipeline processor factories by type ([#7597](https://github.com/opensearch-project/OpenSearch/pull/7597)) +- [Search Pipelines] Add script processor ([#7607](https://github.com/opensearch-project/OpenSearch/pull/7607)) - Add descending order search optimization through reverse segment read. ([#7244](https://github.com/opensearch-project/OpenSearch/pull/7244)) - Add 'unsigned_long' numeric field type ([#6237](https://github.com/opensearch-project/OpenSearch/pull/6237)) - Add back primary shard preference for queries ([#7375](https://github.com/opensearch-project/OpenSearch/pull/7375)) -- Add descending order search optimization through reverse segment read. 
([#7244](https://github.com/opensearch-project/OpenSearch/pull/7244)) +- Add task cancellation timestamp in task API ([#7455](https://github.com/opensearch-project/OpenSearch/pull/7455)) - Adds ExtensionsManager.lookupExtensionSettingsById ([#7466](https://github.com/opensearch-project/OpenSearch/pull/7466)) - SegRep with Remote: Add hook for publishing checkpoint notifications after segment upload to remote store ([#7394](https://github.com/opensearch-project/OpenSearch/pull/7394)) +- Add search_after query optimizations with shard/segment short cutting ([#7453](https://github.com/opensearch-project/OpenSearch/pull/7453)) - Provide mechanism to configure XContent parsing constraints (after update to Jackson 2.15.0 and above) ([#7550](https://github.com/opensearch-project/OpenSearch/pull/7550)) - Support to clear filecache using clear indices cache API ([#7498](https://github.com/opensearch-project/OpenSearch/pull/7498)) - Create NamedRoute to map extension routes to a shortened name ([#6870](https://github.com/opensearch-project/OpenSearch/pull/6870)) +- Added @dbwiddis as an OpenSearch maintainer ([#7665](https://github.com/opensearch-project/OpenSearch/pull/7665)) ### Dependencies - Bump `com.netflix.nebula:gradle-info-plugin` from 12.0.0 to 12.1.3 (#7564) @@ -103,7 +107,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), - Bump `com.netflix.nebula.ospackage-base` from 11.0.0 to 11.3.0 - Bump `gradle.plugin.com.github.johnrengelman:shadow` from 7.1.2 to 8.0.0 - Bump `jna` from 5.11.0 to 5.13.0 -- Bump `commons-io:commons-io` from 2.7 to 2.11.0 +- Bump `commons-io:commons-io` from 2.7 to 2.12.0 (#7661, #7658, #7656) - Bump `org.apache.shiro:shiro-core` from 1.9.1 to 1.11.0 ([#7397](https://github.com/opensearch-project/OpenSearch/pull/7397)) - Bump `jetty-server` in hdfs-fixture from 9.4.49.v20220914 to 9.4.51.v20230217 ([#7405](https://github.com/opensearch-project/OpenSearch/pull/7405)) - OpenJDK Update (April 2023 Patch releases) ([#7448](https://github.com/opensearch-project/OpenSearch/pull/7448)) @@ -116,12 +120,17 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), - Bump `com.azure:azure-storage-common` from 12.20.0 to 12.21.0 (#7566) - Bump `org.apache.commons:commons-compress` from 1.22 to 1.23.0 (#7563) - Bump `jackson` from 2.15.0 to 2.15.1 ([#7603](https://github.com/opensearch-project/OpenSearch/pull/7603)) +- Bump `net.minidev:json-smart` from 2.4.10 to 2.4.11 (#7660) +- Bump `io.projectreactor.netty:reactor-netty-core` from 1.1.5 to 1.1.7 (#7657) +- Bump `org.apache.maven:maven-model` from 3.9.1 to 3.9.2 (#7655) +- Bump `com.google.api:gax` from 2.17.0 to 2.27.0 (#7697) ### Changed - Enable `./gradlew build` on MacOS by disabling bcw tests ([#7303](https://github.com/opensearch-project/OpenSearch/pull/7303)) - Moved concurrent-search from sandbox plugin to server module behind feature flag ([#7203](https://github.com/opensearch-project/OpenSearch/pull/7203)) - Allow access to indices cache clear APIs for read only indexes ([#7303](https://github.com/opensearch-project/OpenSearch/pull/7303)) - Changed concurrent-search threadpool type to be resizable and support task resource tracking ([#7502](https://github.com/opensearch-project/OpenSearch/pull/7502)) +- Default search preference to _primary for searchable snapshot indices ([#7628](https://github.com/opensearch-project/OpenSearch/pull/7628)) ### Deprecated diff --git a/MAINTAINERS.md b/MAINTAINERS.md index daabf8c26c8ac..e05590fb705e7 100644 ---
a/MAINTAINERS.md +++ b/MAINTAINERS.md @@ -5,12 +5,13 @@ This document contains a list of maintainers in this repo. See [opensearch-proje ## Current Maintainers | Maintainer | GitHub ID | Affiliation | -| ------------------------ | ------------------------------------------------------- | ----------- | +|--------------------------| ------------------------------------------------------- | ----------- | | Anas Alkouz | [anasalkouz](https://github.com/anasalkouz) | Amazon | | Andrew Ross | [andrross](https://github.com/andrross) | Amazon | | Andriy Redko | [reta](https://github.com/reta) | Aiven | | Bukhtawar Khan | [Bukhtawar](https://github.com/Bukhtawar) | Amazon | | Charlotte Henkle | [CEHENKLE](https://github.com/CEHENKLE) | Amazon | +| Dan Widdis | [dbwiddis](https://github.com/dbwiddis) | Amazon | | Daniel "dB." Doubrovkine | [dblock](https://github.com/dblock) | Amazon | | Gaurav Bafna | [gbbafna](https://github.com/gbbafna) | Amazon | | Himanshu Setia | [setiah](https://github.com/setiah) | Amazon | @@ -26,7 +27,6 @@ This document contains a list of maintainers in this repo. See [opensearch-proje | Suraj Singh | [dreamer-89](https://github.com/dreamer-89) | Amazon | | Tianli Feng | [tlfeng](https://github.com/tlfeng) | Amazon | | Vacha Shah | [VachaShah](https://github.com/VachaShah) | Amazon | -| Xue Zhou | [xuezhou25](https://github.com/xuezhou25) | Amazon | ## Emeritus @@ -35,3 +35,4 @@ This document contains a list of maintainers in this repo. See [opensearch-proje | Abbas Hussain | [abbashus](https://github.com/abbashus) | Amazon | | Megha Sai Kavikondala | [meghasaik](https://github.com/meghasaik) | Amazon | | Rabi Panda | [adnapibar](https://github.com/adnapibar) | Amazon | +| Xue Zhou | [xuezhou25](https://github.com/xuezhou25) | Amazon | diff --git a/buildSrc/build.gradle b/buildSrc/build.gradle index 32020f00ef7ae..b58652e599ec8 100644 --- a/buildSrc/build.gradle +++ b/buildSrc/build.gradle @@ -117,7 +117,7 @@ dependencies { api 'de.thetaphi:forbiddenapis:3.5.1' api 'com.avast.gradle:gradle-docker-compose-plugin:0.16.12' api "org.yaml:snakeyaml:${props.getProperty('snakeyaml')}" - api 'org.apache.maven:maven-model:3.9.1' + api 'org.apache.maven:maven-model:3.9.2' api 'com.networknt:json-schema-validator:1.0.81' api "com.fasterxml.jackson.core:jackson-databind:${props.getProperty('jackson_databind')}" diff --git a/client/rest-high-level/src/test/java/org/opensearch/client/core/tasks/GetTaskResponseTests.java b/client/rest-high-level/src/test/java/org/opensearch/client/core/tasks/GetTaskResponseTests.java index 3c0250c23ccae..a9b3591c08330 100644 --- a/client/rest-high-level/src/test/java/org/opensearch/client/core/tasks/GetTaskResponseTests.java +++ b/client/rest-high-level/src/test/java/org/opensearch/client/core/tasks/GetTaskResponseTests.java @@ -95,6 +95,10 @@ static TaskInfo randomTaskInfo() { boolean cancellable = randomBoolean(); boolean cancelled = cancellable == true ? randomBoolean() : false; TaskId parentTaskId = randomBoolean() ? TaskId.EMPTY_TASK_ID : randomTaskId(); + Long cancellationStartTime = null; + if (cancelled) { + cancellationStartTime = randomNonNegativeLong(); + } Map<String, String> headers = randomBoolean() ?
Collections.emptyMap() : Collections.singletonMap(randomAlphaOfLength(5), randomAlphaOfLength(5)); @@ -110,7 +114,8 @@ static TaskInfo randomTaskInfo() { cancelled, parentTaskId, headers, - randomResourceStats() + randomResourceStats(), + cancellationStartTime ); } diff --git a/modules/lang-painless/src/yamlRestTest/resources/rest-api-spec/test/painless/71_context_api.yml b/modules/lang-painless/src/yamlRestTest/resources/rest-api-spec/test/painless/71_context_api.yml index 0413661fc586c..478ca9ae8abf4 100644 --- a/modules/lang-painless/src/yamlRestTest/resources/rest-api-spec/test/painless/71_context_api.yml +++ b/modules/lang-painless/src/yamlRestTest/resources/rest-api-spec/test/painless/71_context_api.yml @@ -2,7 +2,7 @@ - do: scripts_painless_context: {} - match: { contexts.0: aggregation_selector} - - match: { contexts.22: update} + - match: { contexts.23: update} --- "Action to get all API values for score context": diff --git a/modules/search-pipeline-common/build.gradle b/modules/search-pipeline-common/build.gradle index cc655e10ada92..fe3c097ff6886 100644 --- a/modules/search-pipeline-common/build.gradle +++ b/modules/search-pipeline-common/build.gradle @@ -15,9 +15,11 @@ apply plugin: 'opensearch.internal-cluster-test' opensearchplugin { description 'Module for search pipeline processors that do not require additional security permissions or have large dependencies and resources' classname 'org.opensearch.search.pipeline.common.SearchPipelineCommonModulePlugin' + extendedPlugins = ['lang-painless'] } dependencies { + compileOnly project(':modules:lang-painless') } restResources { diff --git a/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/FilterQueryRequestProcessor.java b/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/FilterQueryRequestProcessor.java index 81c00012daec6..7deb8faa03af6 100644 --- a/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/FilterQueryRequestProcessor.java +++ b/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/FilterQueryRequestProcessor.java @@ -40,6 +40,11 @@ public class FilterQueryRequestProcessor extends AbstractProcessor implements Se final QueryBuilder filterQuery; + /** + * Returns the type of the processor. + * + * @return The processor type. + */ @Override public String getType() { return TYPE; @@ -57,6 +62,14 @@ public FilterQueryRequestProcessor(String tag, String description, QueryBuilder this.filterQuery = filterQuery; } + /** + * Modifies the search request by adding a filtered query to the existing query, if any, and sets it as the new query + * in the search request's SearchSourceBuilder. + * + * @param request The search request to be processed. + * @return The modified search request. + * @throws Exception if an error occurs while processing the request. 
+ */ @Override public SearchRequest processRequest(SearchRequest request) throws Exception { QueryBuilder originalQuery = null; @@ -75,7 +88,7 @@ public SearchRequest processRequest(SearchRequest request) throws Exception { return request; } - static class Factory implements Processor.Factory { + static class Factory implements Processor.Factory<SearchRequestProcessor> { private final NamedXContentRegistry namedXContentRegistry; public static final ParseField QUERY_FIELD = new ParseField("query"); @@ -85,7 +98,7 @@ static class Factory implements Processor.Factory { @Override public FilterQueryRequestProcessor create( - Map<String, Processor.Factory> processorFactories, + Map<String, Processor.Factory<SearchRequestProcessor>> processorFactories, String tag, String description, Map<String, Object> config diff --git a/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/RenameFieldResponseProcessor.java b/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/RenameFieldResponseProcessor.java index 3a2f0e9fb2492..4c40dda5928f0 100644 --- a/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/RenameFieldResponseProcessor.java +++ b/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/RenameFieldResponseProcessor.java @@ -128,7 +128,7 @@ public SearchResponse processResponse(SearchRequest request, SearchResponse resp /** * This is a factory that creates the RenameFieldResponseProcessor */ - public static final class Factory implements Processor.Factory { + public static final class Factory implements Processor.Factory<SearchResponseProcessor> { /** * Constructor for factory */ @@ -137,7 +137,7 @@ public static final class Factory implements Processor.Factory { @Override public RenameFieldResponseProcessor create( - Map<String, Processor.Factory> processorFactories, + Map<String, Processor.Factory<SearchResponseProcessor>> processorFactories, String tag, String description, Map<String, Object> config diff --git a/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/ScriptRequestProcessor.java b/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/ScriptRequestProcessor.java new file mode 100644 index 0000000000000..015411e0701a4 --- /dev/null +++ b/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/ScriptRequestProcessor.java @@ -0,0 +1,182 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license.
+ */ + +package org.opensearch.search.pipeline.common; + +import org.opensearch.action.search.SearchRequest; + +import org.opensearch.common.Nullable; +import org.opensearch.common.bytes.BytesReference; +import org.opensearch.common.xcontent.LoggingDeprecationHandler; +import org.opensearch.core.xcontent.NamedXContentRegistry; +import org.opensearch.core.xcontent.XContentBuilder; +import org.opensearch.core.xcontent.XContentParser; +import org.opensearch.common.xcontent.XContentType; +import org.opensearch.common.xcontent.json.JsonXContent; + +import org.opensearch.script.Script; +import org.opensearch.script.ScriptException; +import org.opensearch.script.ScriptService; +import org.opensearch.script.ScriptType; +import org.opensearch.script.SearchScript; +import org.opensearch.search.pipeline.Processor; +import org.opensearch.search.pipeline.SearchRequestProcessor; +import org.opensearch.search.pipeline.common.helpers.SearchRequestMap; + +import java.io.InputStream; +import java.util.Arrays; +import java.util.Map; + +import static org.opensearch.ingest.ConfigurationUtils.newConfigurationException; + +/** + * Processor that evaluates a script with a search request in its context + * and then returns the modified search request. + */ +public final class ScriptRequestProcessor extends AbstractProcessor implements SearchRequestProcessor { + /** + * Key to reference this processor type from a search pipeline. + */ + public static final String TYPE = "script"; + + private final Script script; + private final ScriptService scriptService; + private final SearchScript precompiledSearchScript; + + /** + * Processor that evaluates a script with a search request in its context. + * + * @param tag The processor's tag. + * @param description The processor's description. + * @param script The {@link Script} to execute. + * @param precompiledSearchScript The precompiled {@link SearchScript}, if available. + * @param scriptService The {@link ScriptService} used to execute the script. + */ + ScriptRequestProcessor( + String tag, + String description, + Script script, + @Nullable SearchScript precompiledSearchScript, + ScriptService scriptService + ) { + super(tag, description); + this.script = script; + this.precompiledSearchScript = precompiledSearchScript; + this.scriptService = scriptService; + } + + /** + * Executes the script with the search request in context. + * + * @param request The search request passed into the script context. + * @return The modified search request. + * @throws Exception if an error occurs while processing the request. + */ + @Override + public SearchRequest processRequest(SearchRequest request) throws Exception { + // assert request is not null and source is not null + if (request == null || request.source() == null) { + throw new IllegalArgumentException("search request must not be null"); + } + final SearchScript searchScript; + if (precompiledSearchScript == null) { + SearchScript.Factory factory = scriptService.compile(script, SearchScript.CONTEXT); + searchScript = factory.newInstance(script.getParams()); + } else { + searchScript = precompiledSearchScript; + } + // execute the script with the search request in context + searchScript.execute(Map.of("_source", new SearchRequestMap(request))); + return request; + } + + /** + * Returns the type of the processor. + * + * @return The processor type. + */ + @Override + public String getType() { + return TYPE; + } + + /** + * Returns the script used by the processor. + * + * @return The script.
+ */ + Script getScript() { + return script; + } + + /** + * Returns the precompiled search script used by the processor. + * + * @return The precompiled search script. + */ + SearchScript getPrecompiledSearchScript() { + return precompiledSearchScript; + } + + /** + * Factory class for creating {@link ScriptRequestProcessor}. + */ + public static final class Factory implements Processor.Factory<SearchRequestProcessor> { + private final ScriptService scriptService; + + /** + * Constructs a new Factory instance with the specified {@link ScriptService}. + * + * @param scriptService The {@link ScriptService} used to execute scripts. + */ + public Factory(ScriptService scriptService) { + this.scriptService = scriptService; + } + + /** + * Creates a new instance of {@link ScriptRequestProcessor}. + * + * @param registry The registry of processor factories. + * @param processorTag The processor's tag. + * @param description The processor's description. + * @param config The configuration options for the processor. + * @return The created {@link ScriptRequestProcessor} instance. + * @throws Exception if an error occurs during the creation process. + */ + @Override + public ScriptRequestProcessor create( + Map<String, Processor.Factory<SearchRequestProcessor>> registry, + String processorTag, + String description, + Map<String, Object> config + ) throws Exception { + try ( + XContentBuilder builder = XContentBuilder.builder(JsonXContent.jsonXContent).map(config); + InputStream stream = BytesReference.bytes(builder).streamInput(); + XContentParser parser = XContentType.JSON.xContent() + .createParser(NamedXContentRegistry.EMPTY, LoggingDeprecationHandler.INSTANCE, stream) + ) { + Script script = Script.parse(parser); + + Arrays.asList("id", "source", "inline", "lang", "params", "options").forEach(config::remove); + + // verify script is able to be compiled before successfully creating processor. + SearchScript searchScript = null; + try { + final SearchScript.Factory factory = scriptService.compile(script, SearchScript.CONTEXT); + if (ScriptType.INLINE.equals(script.getType())) { + searchScript = factory.newInstance(script.getParams()); + } + } catch (ScriptException e) { + throw newConfigurationException(TYPE, processorTag, null, e); + } + return new ScriptRequestProcessor(processorTag, description, script, searchScript, scriptService); + } + } + } +}
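For context, here is roughly how the new processor is exercised end to end. This is a sketch, not part of the patch: the pipeline and index names are hypothetical, while the endpoint, body shape, and `search_pipeline` query parameter follow the YAML REST tests later in this change.

PUT /_search/pipeline/my_pipeline
{
  "request_processors": [
    {
      "script": {
        "lang": "painless",
        "source": "ctx._source['size'] += 10; ctx._source['from'] = 0"
      }
    }
  ]
}

GET /my-index/_search?search_pipeline=my_pipeline
{ "from": 5, "size": 10, "query": { "match_all": {} } }

The script runs once per search request routed through the pipeline, rewriting the request source before the query phase executes.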
diff --git a/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/SearchPipelineCommonModulePlugin.java b/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/SearchPipelineCommonModulePlugin.java index a0e5182f71443..dc25de460fdba 100644 --- a/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/SearchPipelineCommonModulePlugin.java +++ b/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/SearchPipelineCommonModulePlugin.java @@ -11,6 +11,8 @@ import org.opensearch.plugins.Plugin; import org.opensearch.plugins.SearchPipelinePlugin; import org.opensearch.search.pipeline.Processor; +import org.opensearch.search.pipeline.SearchRequestProcessor; +import org.opensearch.search.pipeline.SearchResponseProcessor; import java.util.Map; @@ -24,13 +26,24 @@ public class SearchPipelineCommonModulePlugin extends Plugin implements SearchPi */ public SearchPipelineCommonModulePlugin() {} + /** + * Returns a map of processor factories. + * + * @param parameters The parameters required for creating the processor factories. + * @return A map of processor factories, where the keys are the processor types and the values are the corresponding factory instances. + */ @Override - public Map<String, Processor.Factory> getProcessors(Processor.Parameters parameters) { + public Map<String, Processor.Factory<SearchRequestProcessor>> getRequestProcessors(Processor.Parameters parameters) { return Map.of( FilterQueryRequestProcessor.TYPE, new FilterQueryRequestProcessor.Factory(parameters.namedXContentRegistry), - RenameFieldResponseProcessor.TYPE, - new RenameFieldResponseProcessor.Factory() + ScriptRequestProcessor.TYPE, + new ScriptRequestProcessor.Factory(parameters.scriptService) ); } + + @Override + public Map<String, Processor.Factory<SearchResponseProcessor>> getResponseProcessors(Processor.Parameters parameters) { + return Map.of(RenameFieldResponseProcessor.TYPE, new RenameFieldResponseProcessor.Factory()); + } } diff --git a/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/helpers/SearchRequestMap.java b/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/helpers/SearchRequestMap.java new file mode 100644 index 0000000000000..7af3ac66be146 --- /dev/null +++ b/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/helpers/SearchRequestMap.java @@ -0,0 +1,395 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + */ + +package org.opensearch.search.pipeline.common.helpers; + +import org.opensearch.action.search.SearchRequest; +import org.opensearch.search.builder.SearchSourceBuilder; + +import java.util.Collection; +import java.util.Map; +import java.util.Set; +import java.util.function.BiConsumer; +import java.util.function.BiFunction; +import java.util.function.Function; + +/** + * A custom implementation of {@link Map} that provides access to the properties of a {@link SearchRequest}'s + * {@link SearchSourceBuilder}. The class allows retrieving and modifying specific properties of the search request. + */ +public class SearchRequestMap implements Map<String, Object> { + private static final String UNSUPPORTED_OP_ERR = " Method not supported in Search pipeline script"; + + private final SearchSourceBuilder source; + + /** + * Constructs a new instance of the {@link SearchRequestMap} with the provided {@link SearchRequest}. + * + * @param searchRequest The SearchRequest containing the SearchSourceBuilder to be accessed. + */ + public SearchRequestMap(SearchRequest searchRequest) { + source = searchRequest.source(); + } + + /** + * Retrieves the number of properties in the SearchSourceBuilder. + * + * @return The number of properties in the SearchSourceBuilder. + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public int size() { + throw new UnsupportedOperationException("size" + UNSUPPORTED_OP_ERR); + } + + /** + * Checks if the SearchSourceBuilder is empty. + * + * @return {@code true} if the SearchSourceBuilder is empty, {@code false} otherwise. + */ + @Override + public boolean isEmpty() { + return source == null; + } + + /** + * Checks if the SearchSourceBuilder contains the specified property. + * + * @param key The property to check for. + * @return {@code true} if the SearchSourceBuilder contains the specified property, {@code false} otherwise. + */ + @Override + public boolean containsKey(Object key) { + return get(key) != null; + } + + /** + * Checks if the SearchSourceBuilder contains the specified value.
+ * + * @param value The value to check for. + * @return {@code true} if the SearchSourceBuilder contains the specified value, {@code false} otherwise. + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public boolean containsValue(Object value) { + throw new UnsupportedOperationException("containsValue" + UNSUPPORTED_OP_ERR); + } + + /** + * Retrieves the value associated with the specified property from the SearchSourceBuilder. + * + * @param key The SearchSourceBuilder property whose value is to be retrieved. + * @return The value associated with the specified property or null if the property has not been initialized. + * @throws IllegalArgumentException if the property name is not a String. + * @throws SearchRequestMapProcessingException if the property is not supported. + */ + @Override + public Object get(Object key) { + if (!(key instanceof String)) { + throw new IllegalArgumentException("key must be a String"); + } + // This is the explicit implementation of fetch value from source + switch ((String) key) { + case "from": + return source.from(); + case "size": + return source.size(); + case "explain": + return source.explain(); + case "version": + return source.version(); + case "seq_no_primary_term": + return source.seqNoAndPrimaryTerm(); + case "track_scores": + return source.trackScores(); + case "track_total_hits": + return source.trackTotalHitsUpTo(); + case "min_score": + return source.minScore(); + case "terminate_after": + return source.terminateAfter(); + case "profile": + return source.profile(); + default: + throw new SearchRequestMapProcessingException("Unsupported key: " + key); + } + } + + /** + * Sets the value for the specified property in the SearchSourceBuilder. + * + * @param key The property whose value is to be set. + * @param value The value to be set for the specified property. + * @return The original value associated with the property, or null if none existed. + * @throws IllegalArgumentException if the property is not a String. + * @throws SearchRequestMapProcessingException if the property is not supported or an error occurs during the setting. + */ + @Override + public Object put(String key, Object value) { + Object originalValue = get(key); + try { + switch (key) { + case "from": + source.from((Integer) value); + break; + case "size": + source.size((Integer) value); + break; + case "explain": + source.explain((Boolean) value); + break; + case "version": + source.version((Boolean) value); + break; + case "seq_no_primary_term": + source.seqNoAndPrimaryTerm((Boolean) value); + break; + case "track_scores": + source.trackScores((Boolean) value); + break; + case "track_total_hits": + source.trackTotalHitsUpTo((Integer) value); + break; + case "min_score": + source.minScore((Float) value); + break; + case "terminate_after": + source.terminateAfter((Integer) value); + break; + case "profile": + source.profile((Boolean) value); + break; + case "stats": // Not modifying stats, sorts, docvalue_fields, etc. as they require more complex handling + case "sort": + case "timeout": + case "docvalue_fields": + case "indices_boost": + default: + throw new SearchRequestMapProcessingException("Unsupported SearchRequest source property: " + key); + } + } catch (Exception e) { + throw new SearchRequestMapProcessingException("Error while setting value for SearchRequest source property: " + key, e); + } + return originalValue; + } + + /** + * Removes the specified property from the SearchSourceBuilder. 
+ * + * @param key The name of the property that will be removed. + * @return The value associated with the property before it was removed, or null if the property was not found. + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public Object remove(Object key) { + throw new UnsupportedOperationException("remove" + UNSUPPORTED_OP_ERR); + } + + /** + * Sets all the properties from the specified map to the SearchSourceBuilder. + * + * @param m The map containing the properties to be set. + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public void putAll(Map<? extends String, ?> m) { + throw new UnsupportedOperationException("putAll" + UNSUPPORTED_OP_ERR); + } + + /** + * Removes all properties from the SearchSourceBuilder. + * + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public void clear() { + throw new UnsupportedOperationException("clear" + UNSUPPORTED_OP_ERR); + } + + /** + * Returns a set view of the property names in the SearchSourceBuilder. + * + * @return A set view of the property names in the SearchSourceBuilder. + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public Set<String> keySet() { + throw new UnsupportedOperationException("keySet" + UNSUPPORTED_OP_ERR); + } + + /** + * Returns a collection view of the property values in the SearchSourceBuilder. + * + * @return A collection view of the property values in the SearchSourceBuilder. + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public Collection<Object> values() { + throw new UnsupportedOperationException("values" + UNSUPPORTED_OP_ERR); + } + + /** + * Returns a set view of the properties in the SearchSourceBuilder. + * + * @return A set view of the properties in the SearchSourceBuilder. + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public Set<Map.Entry<String, Object>> entrySet() { + throw new UnsupportedOperationException("entrySet" + UNSUPPORTED_OP_ERR); + } + + /** + * Returns the value to which the specified property is mapped, or the defaultValue if the property is not present in the + * SearchSourceBuilder. + * + * @param key The property whose associated value is to be returned. + * @param defaultValue The default value to be returned if the property is not present. + * @return The value to which the specified property is mapped, or the defaultValue if the property is not present. + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public Object getOrDefault(Object key, Object defaultValue) { + throw new UnsupportedOperationException("getOrDefault" + UNSUPPORTED_OP_ERR); + } + + /** + * Performs the given action for each property in the SearchSourceBuilder until all properties have been processed or the + * action throws an exception. + * + * @param action The action to be performed for each property. + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public void forEach(BiConsumer<? super String, ? super Object> action) { + throw new UnsupportedOperationException("forEach" + UNSUPPORTED_OP_ERR); + } + + /** + * Replaces each property's value with the result of invoking the given function on that property until all properties have + * been processed or the function throws an exception. + * + * @param function The function to apply to each property.
+ * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public void replaceAll(BiFunction<? super String, ? super Object, ?> function) { + throw new UnsupportedOperationException("replaceAll" + UNSUPPORTED_OP_ERR); + } + + /** + * If the specified property is not already associated with a value, associates it with the given value and returns null, + * else returns the current value. + * + * @param key The property whose value is to be set if absent. + * @param value The value to be associated with the specified property. + * @return The current value associated with the property, or null if the property is not present. + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public Object putIfAbsent(String key, Object value) { + throw new UnsupportedOperationException("putIfAbsent" + UNSUPPORTED_OP_ERR); + } + + /** + * Removes the property only if it has the given value. + * + * @param key The property to be removed. + * @param value The value expected to be associated with the property. + * @return {@code true} if the entry was removed, {@code false} otherwise. + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public boolean remove(Object key, Object value) { + throw new UnsupportedOperationException("remove" + UNSUPPORTED_OP_ERR); + } + + /** + * Replaces the specified property only if it has the given value. + * + * @param key The property to be replaced. + * @param oldValue The value expected to be associated with the property. + * @param newValue The value to be associated with the property. + * @return {@code true} if the property was replaced, {@code false} otherwise. + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public boolean replace(String key, Object oldValue, Object newValue) { + throw new UnsupportedOperationException("replace" + UNSUPPORTED_OP_ERR); + } + + /** + * Replaces the value for the specified property only if it is currently mapped to some value. + * + * @param key The property to be replaced. + * @param value The value to be associated with the property. + * @return The previous value associated with the property, or null if the property was not found. + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public Object replace(String key, Object value) { + throw new UnsupportedOperationException("replace" + UNSUPPORTED_OP_ERR); + } + + /** + * If the specified property is not already associated with a value, attempts to compute its value using the given + * mapping function. + * + * @param key The property whose value is to be computed if absent. + * @param mappingFunction The function to compute a value based on the property. + * @return The computed value associated with the property, or null if the property is not present. + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public Object computeIfAbsent(String key, Function<? super String, ?> mappingFunction) { + throw new UnsupportedOperationException("computeIfAbsent" + UNSUPPORTED_OP_ERR); + } + + /** + * If the value for the specified property is present, attempts to compute a new mapping given the property and its current + * mapped value. + * + * @param key The property for which the mapping is to be computed. + * @param remappingFunction The function to compute a new mapping. + * @return The new value associated with the property, or null if the property is not present.
+ * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public Object computeIfPresent(String key, BiFunction<? super String, ? super Object, ?> remappingFunction) { + throw new UnsupportedOperationException("computeIfPresent" + UNSUPPORTED_OP_ERR); + } + + /** + * If the value for the specified property is present, attempts to compute a new mapping given the property and its current + * mapped value, or removes the property if the computed value is null. + * + * @param key The property for which the mapping is to be computed. + * @param remappingFunction The function to compute a new mapping. + * @return The new value associated with the property, or null if the property is not present. + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public Object compute(String key, BiFunction<? super String, ? super Object, ?> remappingFunction) { + throw new UnsupportedOperationException("compute" + UNSUPPORTED_OP_ERR); + } + + /** + * If the specified property is not already associated with a value or is associated with null, associates it with the + * given non-null value. Otherwise, replaces the associated value with the results of applying the given + * remapping function to the current and new values. + * + * @param key The property for which the mapping is to be merged. + * @param value The non-null value to be merged with the existing value. + * @param remappingFunction The function to merge the existing and new values. + * @return The new value associated with the property, or null if the property is not present. + * @throws UnsupportedOperationException always, as the method is not supported. + */ + @Override + public Object merge(String key, Object value, BiFunction<? super Object, ? super Object, ?> remappingFunction) { + throw new UnsupportedOperationException("merge" + UNSUPPORTED_OP_ERR); + } +}
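This map is what a pipeline script actually sees: ScriptRequestProcessor binds a SearchRequestMap under the `_source` key of the script context, so a painless script reads and writes the supported request properties with plain map syntax. A minimal sketch of a script body (any key outside the supported set above throws SearchRequestMapProcessingException):

// painless pipeline script body
ctx._source['from'] = 0;
ctx._source['size'] = ctx._source['size'] + 10;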
diff --git a/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/helpers/SearchRequestMapProcessingException.java b/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/helpers/SearchRequestMapProcessingException.java new file mode 100644 index 0000000000000..cb1e45a20b624 --- /dev/null +++ b/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/helpers/SearchRequestMapProcessingException.java @@ -0,0 +1,39 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + */ + +package org.opensearch.search.pipeline.common.helpers; + +import org.opensearch.OpenSearchException; +import org.opensearch.OpenSearchWrapperException; + +/** + * An exception that indicates an error occurred while processing a {@link SearchRequestMap}. + */ +public class SearchRequestMapProcessingException extends OpenSearchException implements OpenSearchWrapperException { + + /** + * Constructs a new SearchRequestMapProcessingException with the specified message. + * + * @param msg The error message. + * @param args Arguments to substitute in the error message. + */ + public SearchRequestMapProcessingException(String msg, Object... args) { + super(msg, args); + } + + /** + * Constructs a new SearchRequestMapProcessingException with the specified message and cause. + * + * @param msg The error message. + * @param cause The cause of the exception. + * @param args Arguments to substitute in the error message. + */ + public SearchRequestMapProcessingException(String msg, Throwable cause, Object... args) { + super(msg, cause, args); + } +} diff --git a/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/helpers/package-info.java b/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/helpers/package-info.java new file mode 100644 index 0000000000000..b960ff72a9e2b --- /dev/null +++ b/modules/search-pipeline-common/src/main/java/org/opensearch/search/pipeline/common/helpers/package-info.java @@ -0,0 +1,12 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + */ + +/** + * Provides helper classes and utilities for working with search pipeline processors. + */ +package org.opensearch.search.pipeline.common.helpers; diff --git a/modules/search-pipeline-common/src/test/java/org/opensearch/search/pipeline/common/ScriptRequestProcessorTests.java b/modules/search-pipeline-common/src/test/java/org/opensearch/search/pipeline/common/ScriptRequestProcessorTests.java new file mode 100644 index 0000000000000..2fb3b2345e7e2 --- /dev/null +++ b/modules/search-pipeline-common/src/test/java/org/opensearch/search/pipeline/common/ScriptRequestProcessorTests.java @@ -0,0 +1,131 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + */ + +package org.opensearch.search.pipeline.common; + +import org.junit.Before; +import org.opensearch.action.search.SearchRequest; +import org.opensearch.common.unit.TimeValue; +import org.opensearch.common.settings.Settings; +import org.opensearch.script.MockScriptEngine; +import org.opensearch.script.Script; +import org.opensearch.script.ScriptModule; +import org.opensearch.script.ScriptService; +import org.opensearch.script.SearchScript; +import org.opensearch.script.ScriptType; +import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.search.pipeline.common.helpers.SearchRequestMap; +import org.opensearch.test.OpenSearchTestCase; + +import java.util.Collections; +import java.util.Map; +import java.util.HashMap; + +import static org.hamcrest.core.Is.is; +import java.util.concurrent.TimeUnit; + +public class ScriptRequestProcessorTests extends OpenSearchTestCase { + + private ScriptService scriptService; + private Script script; + private SearchScript searchScript; + + @Before + public void setupScripting() { + String scriptName = "search_script"; + scriptService = new ScriptService( + Settings.builder().build(), + Map.of(Script.DEFAULT_SCRIPT_LANG, new MockScriptEngine(Script.DEFAULT_SCRIPT_LANG, Map.of(scriptName, ctx -> { + Object sourceObj = ctx.get("_source"); + if (sourceObj instanceof Map) { + Map<String, Object> source = (SearchRequestMap) sourceObj; + + // Update all modifiable source fields + Integer from = (Integer) source.get("from"); + source.put("from", from + 10); + + Integer size = (Integer) source.get("size"); + source.put("size", size + 10); + + Boolean explain = (Boolean) source.get("explain"); + source.put("explain", !explain); + + Boolean version = (Boolean) source.get("version"); + source.put("version", !version); + + Boolean seqNoAndPrimaryTerm = (Boolean) source.get("seq_no_primary_term"); + source.put("seq_no_primary_term", !seqNoAndPrimaryTerm); + + Boolean trackScores = (Boolean) source.get("track_scores"); + source.put("track_scores", !trackScores); + + Integer
trackTotalHitsUpTo = (Integer) source.get("track_total_hits"); + source.put("track_total_hits", trackTotalHitsUpTo + 1); + + Float minScore = (Float) source.get("min_score"); + source.put("min_score", minScore + 1.0f); + + Integer terminateAfter = (Integer) source.get("terminate_after"); + source.put("terminate_after", terminateAfter + 1); + } + return null; + }), Collections.emptyMap())), + new HashMap<>(ScriptModule.CORE_CONTEXTS) + ); + script = new Script(ScriptType.INLINE, Script.DEFAULT_SCRIPT_LANG, scriptName, Collections.emptyMap()); + searchScript = scriptService.compile(script, SearchScript.CONTEXT).newInstance(script.getParams()); + } + + public void testScriptingWithoutPrecompiledScriptFactory() throws Exception { + ScriptRequestProcessor processor = new ScriptRequestProcessor(randomAlphaOfLength(10), null, script, null, scriptService); + SearchRequest searchRequest = new SearchRequest(); + searchRequest.source(createSearchSourceBuilder()); + + assertNotNull(searchRequest); + processor.processRequest(searchRequest); + assertSearchRequest(searchRequest); + } + + public void testScriptingWithPrecompiledIngestScript() throws Exception { + ScriptRequestProcessor processor = new ScriptRequestProcessor(randomAlphaOfLength(10), null, script, searchScript, scriptService); + SearchRequest searchRequest = new SearchRequest(); + searchRequest.source(createSearchSourceBuilder()); + + assertNotNull(searchRequest); + processor.processRequest(searchRequest); + assertSearchRequest(searchRequest); + } + + private SearchSourceBuilder createSearchSourceBuilder() { + SearchSourceBuilder source = new SearchSourceBuilder(); + source.from(10); + source.size(20); + source.explain(true); + source.version(true); + source.seqNoAndPrimaryTerm(true); + source.trackScores(true); + source.trackTotalHitsUpTo(3); + source.minScore(1.0f); + source.timeout(new TimeValue(60, TimeUnit.SECONDS)); + source.terminateAfter(5); + return source; + } + + private void assertSearchRequest(SearchRequest searchRequest) { + assertThat(searchRequest.source().from(), is(20)); + assertThat(searchRequest.source().size(), is(30)); + assertThat(searchRequest.source().explain(), is(false)); + assertThat(searchRequest.source().version(), is(false)); + assertThat(searchRequest.source().seqNoAndPrimaryTerm(), is(false)); + assertThat(searchRequest.source().trackScores(), is(false)); + assertThat(searchRequest.source().trackTotalHitsUpTo(), is(4)); + assertThat(searchRequest.source().minScore(), is(2.0f)); + assertThat(searchRequest.source().timeout(), is(new TimeValue(60, TimeUnit.SECONDS))); + assertThat(searchRequest.source().terminateAfter(), is(6)); + } +} diff --git a/modules/search-pipeline-common/src/test/java/org/opensearch/search/pipeline/common/helpers/SearchRequestMapTests.java b/modules/search-pipeline-common/src/test/java/org/opensearch/search/pipeline/common/helpers/SearchRequestMapTests.java new file mode 100644 index 0000000000000..5572f28335e1c --- /dev/null +++ b/modules/search-pipeline-common/src/test/java/org/opensearch/search/pipeline/common/helpers/SearchRequestMapTests.java @@ -0,0 +1,149 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. 
+ */ +package org.opensearch.search.pipeline.common.helpers; + +import org.opensearch.action.search.SearchRequest; +import org.opensearch.search.builder.SearchSourceBuilder; +import org.opensearch.test.AbstractBuilderTestCase; + +public class SearchRequestMapTests extends AbstractBuilderTestCase { + + public void testEmptyMap() { + SearchRequest searchRequest = new SearchRequest(); + SearchRequestMap map = new SearchRequestMap(searchRequest); + + assertTrue(map.isEmpty()); + } + + public void testGet() { + SearchRequest searchRequest = new SearchRequest(); + SearchSourceBuilder source = new SearchSourceBuilder(); + source.from(10); + source.size(20); + source.explain(true); + source.version(true); + source.seqNoAndPrimaryTerm(true); + source.trackScores(true); + source.trackTotalHitsUpTo(3); + source.minScore(1.0f); + source.terminateAfter(5); + searchRequest.source(source); + + SearchRequestMap map = new SearchRequestMap(searchRequest); + + assertEquals(10, map.get("from")); + assertEquals(20, map.get("size")); + assertEquals(true, map.get("explain")); + assertEquals(true, map.get("version")); + assertEquals(true, map.get("seq_no_primary_term")); + assertEquals(true, map.get("track_scores")); + assertEquals(3, map.get("track_total_hits")); + assertEquals(1.0f, map.get("min_score")); + assertEquals(5, map.get("terminate_after")); + } + + public void testPut() { + SearchRequest searchRequest = new SearchRequest(); + SearchSourceBuilder source = new SearchSourceBuilder(); + searchRequest.source(source); + + SearchRequestMap map = new SearchRequestMap(searchRequest); + + assertEquals(-1, map.put("from", 10)); + assertEquals(10, map.get("from")); + + assertEquals(-1, map.put("size", 20)); + assertEquals(20, map.get("size")); + + assertNull(map.put("explain", true)); + assertEquals(true, map.get("explain")); + + assertNull(map.put("version", true)); + assertEquals(true, map.get("version")); + + assertNull(map.put("seq_no_primary_term", true)); + assertEquals(true, map.get("seq_no_primary_term")); + + assertEquals(false, map.put("track_scores", true)); + assertEquals(true, map.get("track_scores")); + + assertNull(map.put("track_total_hits", 3)); + assertEquals(3, map.get("track_total_hits")); + + assertNull(map.put("min_score", 1.0f)); + assertEquals(1.0f, map.get("min_score")); + + assertEquals(0, map.put("terminate_after", 5)); + assertEquals(5, map.get("terminate_after")); + } + + public void testUnsupportedOperationException() { + SearchRequest searchRequest = new SearchRequest(); + SearchSourceBuilder source = new SearchSourceBuilder(); + searchRequest.source(source); + + SearchRequestMap map = new SearchRequestMap(searchRequest); + + assertThrows(UnsupportedOperationException.class, () -> map.size()); + assertThrows(UnsupportedOperationException.class, () -> map.containsValue(null)); + assertThrows(UnsupportedOperationException.class, () -> map.remove(null)); + assertThrows(UnsupportedOperationException.class, () -> map.putAll(null)); + assertThrows(UnsupportedOperationException.class, map::clear); + assertThrows(UnsupportedOperationException.class, map::keySet); + assertThrows(UnsupportedOperationException.class, map::values); + assertThrows(UnsupportedOperationException.class, map::entrySet); + assertThrows(UnsupportedOperationException.class, () -> map.getOrDefault(null, null)); + assertThrows(UnsupportedOperationException.class, () -> map.forEach(null)); + assertThrows(UnsupportedOperationException.class, () -> map.replaceAll(null)); + assertThrows(UnsupportedOperationException.class, 
() -> map.putIfAbsent(null, null)); + assertThrows(UnsupportedOperationException.class, () -> map.remove(null, null)); + assertThrows(UnsupportedOperationException.class, () -> map.replace(null, null, null)); + assertThrows(UnsupportedOperationException.class, () -> map.replace(null, null)); + assertThrows(UnsupportedOperationException.class, () -> map.computeIfAbsent(null, null)); + assertThrows(UnsupportedOperationException.class, () -> map.computeIfPresent(null, null)); + assertThrows(UnsupportedOperationException.class, () -> map.compute(null, null)); + assertThrows(UnsupportedOperationException.class, () -> map.merge(null, null, null)); + } + + public void testIllegalArgumentException() { + SearchRequest searchRequest = new SearchRequest(); + SearchSourceBuilder source = new SearchSourceBuilder(); + searchRequest.source(source); + + SearchRequestMap map = new SearchRequestMap(searchRequest); + + try { + map.get(1); + fail("Expected IllegalArgumentException"); + } catch (IllegalArgumentException e) { + // expected + } + } + + public void testSearchRequestMapProcessingException() { + SearchRequest searchRequest = new SearchRequest(); + SearchSourceBuilder source = new SearchSourceBuilder(); + searchRequest.source(source); + + SearchRequestMap map = new SearchRequestMap(searchRequest); + + try { + map.get("unsupported_key"); + fail("Expected SearchRequestMapProcessingException"); + } catch (SearchRequestMapProcessingException e) { + // expected + } + + try { + map.put("unsupported_key", 10); + fail("Expected SearchRequestMapProcessingException"); + } catch (SearchRequestMapProcessingException e) { + // expected + } + } +} diff --git a/modules/search-pipeline-common/src/yamlRestTest/resources/rest-api-spec/test/search_pipeline/10_basic.yml b/modules/search-pipeline-common/src/yamlRestTest/resources/rest-api-spec/test/search_pipeline/10_basic.yml index 0d931f8587664..ca53f6cd6a7e8 100644 --- a/modules/search-pipeline-common/src/yamlRestTest/resources/rest-api-spec/test/search_pipeline/10_basic.yml +++ b/modules/search-pipeline-common/src/yamlRestTest/resources/rest-api-spec/test/search_pipeline/10_basic.yml @@ -12,5 +12,6 @@ nodes.info: {} - contains: { nodes.$cluster_manager.modules: { name: search-pipeline-common } } - - contains: { nodes.$cluster_manager.search_pipelines.processors: { type: filter_query } } - - contains: { nodes.$cluster_manager.search_pipelines.processors: { type: rename_field } } + - contains: { nodes.$cluster_manager.search_pipelines.request_processors: { type: filter_query } } + - contains: { nodes.$cluster_manager.search_pipelines.request_processors: { type: script } } + - contains: { nodes.$cluster_manager.search_pipelines.response_processors: { type: rename_field } } diff --git a/modules/search-pipeline-common/src/yamlRestTest/resources/rest-api-spec/test/search_pipeline/50_script_processor.yml b/modules/search-pipeline-common/src/yamlRestTest/resources/rest-api-spec/test/search_pipeline/50_script_processor.yml new file mode 100644 index 0000000000000..bba52285fd58d --- /dev/null +++ b/modules/search-pipeline-common/src/yamlRestTest/resources/rest-api-spec/test/search_pipeline/50_script_processor.yml @@ -0,0 +1,96 @@ +--- +teardown: + - do: + search_pipeline.delete: + id: "my_pipeline" + ignore: 404 + +--- +"Test empty script in script processor": + - do: + catch: bad_request + search_pipeline.put: + id: "my_pipeline" + body: > + { + "description": "_description", + "request_processors": [ + { + "script" : { + "lang": "painless", + "source" : "" + } + } + ] + } + 
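+      # The PUT fails at pipeline creation time: the factory compiles the script up front, and an empty inline painless source does not compile.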
+ - match: { status: 400 } + - match: { error.root_cause.0.type: "script_exception" } + +--- +"Test supported search source builder fields": + - do: + search_pipeline.put: + id: "my_pipeline" + body: > + { + "description": "_description", + "request_processors": [ + { + "script" : { + "lang" : "painless", + "source" : "ctx._source['size'] += 10; ctx._source['from'] -= 1; ctx._source['explain'] = !ctx._source['explain']; ctx._source['version'] = !ctx._source['version']; ctx._source['seq_no_primary_term'] = !ctx._source['seq_no_primary_term']; ctx._source['track_scores'] = !ctx._source['track_scores']; ctx._source['track_total_hits'] = 1; ctx._source['min_score'] -= 0.9; ctx._source['terminate_after'] += 2; ctx._source['profile'] = !ctx._source['profile'];" + } + } + ] + } + - match: { acknowledged: true } + + - do: + index: + index: test + id: 1 + body: { + "field": 2 + } + - do: + index: + index: test + id: 2 + body: { + "field": 3 + } + + - do: + indices.refresh: + index: test + + - do: + search: + index: test + search_pipeline: "my_pipeline" + body: { + "from": 1, + "size": 1, + "explain": true, + "version": true, + "seq_no_primary_term": true, + "track_scores": true, + "track_total_hits": true, + "min_score": 1.0, + "timeout": "60s", + "terminate_after": 2, + "profile": true + } + - length: { hits.hits: 2 } + - match: { _shards.total: 1 } + - match: { hits.total.value: 1 } + - match: { hits.hits.0._score: 1.0 } + - match: { hits.hits.1._score: 1.0 } + - is_false: hits.hits.0._explanation + - is_false: hits.hits.1._explanation + - is_false: hits.hits.0._seq_no + - is_false: hits.hits.1._seq_no + - is_false: hits.hits.0._primary_term + - is_false: hits.hits.1._primary_term + - is_false: profile diff --git a/plugins/discovery-azure-classic/build.gradle b/plugins/discovery-azure-classic/build.gradle index c88d19f0e2806..00953141b51e1 100644 --- a/plugins/discovery-azure-classic/build.gradle +++ b/plugins/discovery-azure-classic/build.gradle @@ -53,7 +53,7 @@ dependencies { api "org.apache.logging.log4j:log4j-1.2-api:${versions.log4j}" api "commons-codec:commons-codec:${versions.commonscodec}" api "commons-lang:commons-lang:2.6" - api "commons-io:commons-io:2.11.0" + api "commons-io:commons-io:2.12.0" api 'javax.mail:mail:1.4.7' api 'javax.inject:javax.inject:1' api "com.sun.jersey:jersey-client:${versions.jersey}" diff --git a/plugins/discovery-azure-classic/licenses/commons-io-2.11.0.jar.sha1 b/plugins/discovery-azure-classic/licenses/commons-io-2.11.0.jar.sha1 deleted file mode 100644 index 8adec30bade49..0000000000000 --- a/plugins/discovery-azure-classic/licenses/commons-io-2.11.0.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -a2503f302b11ebde7ebc3df41daebe0e4eea3689 \ No newline at end of file diff --git a/plugins/discovery-azure-classic/licenses/commons-io-2.12.0.jar.sha1 b/plugins/discovery-azure-classic/licenses/commons-io-2.12.0.jar.sha1 new file mode 100644 index 0000000000000..5225b130fb817 --- /dev/null +++ b/plugins/discovery-azure-classic/licenses/commons-io-2.12.0.jar.sha1 @@ -0,0 +1 @@ +e5e3eb2ff05b494287f51476bc715161412c525f \ No newline at end of file diff --git a/plugins/ingest-attachment/build.gradle b/plugins/ingest-attachment/build.gradle index 4ca580ba3620f..fe4b0fb4e90ee 100644 --- a/plugins/ingest-attachment/build.gradle +++ b/plugins/ingest-attachment/build.gradle @@ -57,7 +57,7 @@ dependencies { runtimeOnly "com.google.guava:guava:${versions.guava}" // Other dependencies api 'org.tukaani:xz:1.9' - api 'commons-io:commons-io:2.11.0' + api 'commons-io:commons-io:2.12.0' api 
"org.slf4j:slf4j-api:${versions.slf4j}" // character set detection diff --git a/plugins/ingest-attachment/licenses/commons-io-2.11.0.jar.sha1 b/plugins/ingest-attachment/licenses/commons-io-2.11.0.jar.sha1 deleted file mode 100644 index 8adec30bade49..0000000000000 --- a/plugins/ingest-attachment/licenses/commons-io-2.11.0.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -a2503f302b11ebde7ebc3df41daebe0e4eea3689 \ No newline at end of file diff --git a/plugins/ingest-attachment/licenses/commons-io-2.12.0.jar.sha1 b/plugins/ingest-attachment/licenses/commons-io-2.12.0.jar.sha1 new file mode 100644 index 0000000000000..5225b130fb817 --- /dev/null +++ b/plugins/ingest-attachment/licenses/commons-io-2.12.0.jar.sha1 @@ -0,0 +1 @@ +e5e3eb2ff05b494287f51476bc715161412c525f \ No newline at end of file diff --git a/plugins/repository-azure/build.gradle b/plugins/repository-azure/build.gradle index 7b18facadcb30..5478a36fd5885 100644 --- a/plugins/repository-azure/build.gradle +++ b/plugins/repository-azure/build.gradle @@ -59,7 +59,7 @@ dependencies { api 'org.reactivestreams:reactive-streams:1.0.4' api 'io.projectreactor:reactor-core:3.5.1' api 'io.projectreactor.netty:reactor-netty:1.1.4' - api 'io.projectreactor.netty:reactor-netty-core:1.1.5' + api 'io.projectreactor.netty:reactor-netty-core:1.1.7' api 'io.projectreactor.netty:reactor-netty-http:1.1.4' api "org.slf4j:slf4j-api:${versions.slf4j}" api "com.fasterxml.jackson.core:jackson-annotations:${versions.jackson}" diff --git a/plugins/repository-azure/licenses/reactor-netty-core-1.1.5.jar.sha1 b/plugins/repository-azure/licenses/reactor-netty-core-1.1.5.jar.sha1 deleted file mode 100644 index 93120d7bfc2e1..0000000000000 --- a/plugins/repository-azure/licenses/reactor-netty-core-1.1.5.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -b3de902598436fba650e3213b2b7b9505270307b \ No newline at end of file diff --git a/plugins/repository-azure/licenses/reactor-netty-core-1.1.7.jar.sha1 b/plugins/repository-azure/licenses/reactor-netty-core-1.1.7.jar.sha1 new file mode 100644 index 0000000000000..62ed795cb11e9 --- /dev/null +++ b/plugins/repository-azure/licenses/reactor-netty-core-1.1.7.jar.sha1 @@ -0,0 +1 @@ +d38bb526a501f52c4476b03730c710a96f8fd35b \ No newline at end of file diff --git a/plugins/repository-gcs/build.gradle b/plugins/repository-gcs/build.gradle index 33934d0369f48..97556eacd1552 100644 --- a/plugins/repository-gcs/build.gradle +++ b/plugins/repository-gcs/build.gradle @@ -63,7 +63,7 @@ dependencies { api "org.apache.logging.log4j:log4j-1.2-api:${versions.log4j}" api "commons-codec:commons-codec:${versions.commonscodec}" api 'com.google.api:api-common:1.8.1' - api 'com.google.api:gax:2.17.0' + api 'com.google.api:gax:2.27.0' api 'org.threeten:threetenbp:1.4.4' api 'com.google.code.gson:gson:2.9.0' api 'com.google.api.grpc:proto-google-common-protos:2.10.0' @@ -141,7 +141,6 @@ thirdPartyAudit { 'com.google.appengine.api.urlfetch.HTTPResponse', 'com.google.appengine.api.urlfetch.URLFetchService', 'com.google.appengine.api.urlfetch.URLFetchServiceFactory', - 'com.oracle.svm.core.configure.ResourcesRegistry', 'com.google.protobuf.util.JsonFormat', 'com.google.protobuf.util.JsonFormat$Parser', 'com.google.protobuf.util.JsonFormat$Printer', @@ -193,13 +192,11 @@ thirdPartyAudit { 'org.apache.http.protocol.HttpContext', 'org.apache.http.protocol.HttpProcessor', 'org.apache.http.protocol.HttpRequestExecutor', - 'org.graalvm.nativeimage.ImageSingletons', 'org.graalvm.nativeimage.hosted.Feature', 'org.graalvm.nativeimage.hosted.Feature$BeforeAnalysisAccess', 
'org.graalvm.nativeimage.hosted.Feature$DuringAnalysisAccess', 'org.graalvm.nativeimage.hosted.Feature$FeatureAccess', 'org.graalvm.nativeimage.hosted.RuntimeReflection', - 'org.graalvm.nativeimage.impl.ConfigurationCondition', // commons-logging provided dependencies 'javax.jms.Message', 'javax.servlet.ServletContextEvent', diff --git a/plugins/repository-gcs/licenses/gax-2.17.0.jar.sha1 b/plugins/repository-gcs/licenses/gax-2.17.0.jar.sha1 deleted file mode 100644 index 37b5e6adf8ad7..0000000000000 --- a/plugins/repository-gcs/licenses/gax-2.17.0.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -f78b89e5c00c7a88c8d874a257f752094ea16321 \ No newline at end of file diff --git a/plugins/repository-gcs/licenses/gax-2.27.0.jar.sha1 b/plugins/repository-gcs/licenses/gax-2.27.0.jar.sha1 new file mode 100644 index 0000000000000..1813a3aa94404 --- /dev/null +++ b/plugins/repository-gcs/licenses/gax-2.27.0.jar.sha1 @@ -0,0 +1 @@ +04a27757c9240da71f896be39f47aaa6e23ef989 \ No newline at end of file diff --git a/plugins/repository-hdfs/build.gradle b/plugins/repository-hdfs/build.gradle index e2f95fec52d43..f106061f8b91c 100644 --- a/plugins/repository-hdfs/build.gradle +++ b/plugins/repository-hdfs/build.gradle @@ -75,7 +75,7 @@ dependencies { api 'commons-collections:commons-collections:3.2.2' api 'org.apache.commons:commons-compress:1.23.0' api 'org.apache.commons:commons-configuration2:2.9.0' - api 'commons-io:commons-io:2.11.0' + api 'commons-io:commons-io:2.12.0' api 'org.apache.commons:commons-lang3:3.12.0' implementation 'com.google.re2j:re2j:1.7' api 'javax.servlet:servlet-api:2.5' diff --git a/plugins/repository-hdfs/licenses/commons-io-2.11.0.jar.sha1 b/plugins/repository-hdfs/licenses/commons-io-2.11.0.jar.sha1 deleted file mode 100644 index 8adec30bade49..0000000000000 --- a/plugins/repository-hdfs/licenses/commons-io-2.11.0.jar.sha1 +++ /dev/null @@ -1 +0,0 @@ -a2503f302b11ebde7ebc3df41daebe0e4eea3689 \ No newline at end of file diff --git a/plugins/repository-hdfs/licenses/commons-io-2.12.0.jar.sha1 b/plugins/repository-hdfs/licenses/commons-io-2.12.0.jar.sha1 new file mode 100644 index 0000000000000..5225b130fb817 --- /dev/null +++ b/plugins/repository-hdfs/licenses/commons-io-2.12.0.jar.sha1 @@ -0,0 +1 @@ +e5e3eb2ff05b494287f51476bc715161412c525f \ No newline at end of file diff --git a/server/src/internalClusterTest/java/org/opensearch/action/admin/cluster/node/tasks/TasksIT.java b/server/src/internalClusterTest/java/org/opensearch/action/admin/cluster/node/tasks/TasksIT.java index 4d297555c3acc..67e52529ae86b 100644 --- a/server/src/internalClusterTest/java/org/opensearch/action/admin/cluster/node/tasks/TasksIT.java +++ b/server/src/internalClusterTest/java/org/opensearch/action/admin/cluster/node/tasks/TasksIT.java @@ -511,6 +511,17 @@ public void testTasksCancellation() throws Exception { .get(); assertEquals(1, cancelTasksResponse.getTasks().size()); + // Tasks are marked as cancelled at this point but not yet completed. 
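+ // Listing them now verifies that cancellation flips isCancelled() and records a cancellation start time.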
+ List taskInfoList = client().admin() + .cluster() + .prepareListTasks() + .setActions(TestTaskPlugin.TestTaskAction.NAME + "*") + .get() + .getTasks(); + for (TaskInfo taskInfo : taskInfoList) { + assertTrue(taskInfo.isCancelled()); + assertNotNull(taskInfo.getCancellationStartTime()); + } future.get(); logger.info("--> checking that test tasks are not running"); diff --git a/server/src/internalClusterTest/java/org/opensearch/remotestore/SegmentReplicationRemoteStoreIT.java b/server/src/internalClusterTest/java/org/opensearch/remotestore/SegmentReplicationRemoteStoreIT.java index 69367da9ac557..055f997fbe197 100644 --- a/server/src/internalClusterTest/java/org/opensearch/remotestore/SegmentReplicationRemoteStoreIT.java +++ b/server/src/internalClusterTest/java/org/opensearch/remotestore/SegmentReplicationRemoteStoreIT.java @@ -8,6 +8,7 @@ package org.opensearch.remotestore; +import org.apache.lucene.tests.util.LuceneTestCase; import org.junit.After; import org.junit.Before; import org.opensearch.cluster.metadata.IndexMetadata; @@ -25,6 +26,7 @@ * This makes sure that the constructs/flows that are being tested with Segment Replication, holds true after enabling * remote store. */ +@LuceneTestCase.AwaitsFix(bugUrl = "https://github.com/opensearch-project/OpenSearch/issues/7643") @OpenSearchIntegTestCase.ClusterScope(scope = OpenSearchIntegTestCase.Scope.TEST, numDataNodes = 0) public class SegmentReplicationRemoteStoreIT extends SegmentReplicationIT { diff --git a/server/src/internalClusterTest/java/org/opensearch/snapshots/SearchableSnapshotIT.java b/server/src/internalClusterTest/java/org/opensearch/snapshots/SearchableSnapshotIT.java index e9021dd8ed9ef..bf9ea4a3a781f 100644 --- a/server/src/internalClusterTest/java/org/opensearch/snapshots/SearchableSnapshotIT.java +++ b/server/src/internalClusterTest/java/org/opensearch/snapshots/SearchableSnapshotIT.java @@ -10,6 +10,8 @@ import org.opensearch.action.admin.cluster.node.stats.NodeStats; import org.opensearch.action.admin.cluster.node.stats.NodesStatsRequest; import org.opensearch.action.admin.cluster.node.stats.NodesStatsResponse; +import org.opensearch.action.admin.cluster.shards.ClusterSearchShardsGroup; +import org.opensearch.action.admin.cluster.shards.ClusterSearchShardsRequest; import org.opensearch.action.admin.cluster.snapshots.create.CreateSnapshotResponse; import org.opensearch.action.admin.cluster.snapshots.delete.DeleteSnapshotRequest; import org.opensearch.action.admin.cluster.snapshots.get.GetSnapshotsResponse; @@ -41,6 +43,7 @@ import java.io.IOException; import java.nio.file.Files; import java.nio.file.Path; +import java.util.Arrays; import java.util.List; import java.util.Map; import java.util.stream.Collectors; @@ -664,6 +667,58 @@ public void testCacheIndexFilesClearedOnDelete() throws Exception { logger.info("--> validated that the cache file path doesn't exist"); } + /** + * Test scenario that validates that the default search preference for searchable snapshot + * is primary shards + */ + public void testDefaultShardPreference() throws Exception { + final int numReplicas = 1; + final String indexName = "test-idx"; + final String restoredIndexName = indexName + "-copy"; + final String repoName = "test-repo"; + final String snapshotName = "test-snap"; + final Client client = client(); + + // Create an index, snapshot and restore as a searchable snapshot index + internalCluster().ensureAtLeastNumSearchAndDataNodes(numReplicas + 1); + createIndexWithDocsAndEnsureGreen(numReplicas, 100, indexName); + 
createRepositoryWithSettings(null, repoName); + takeSnapshot(client, snapshotName, repoName, indexName); + restoreSnapshotAndEnsureGreen(client, snapshotName, repoName); + assertDocCount(restoredIndexName, 100L); + assertRemoteSnapshotIndexSettings(client, restoredIndexName); + + // ClusterSearchShards API returns a list of shards that will be used + // when querying a particular index + ClusterSearchShardsGroup[] shardGroups = client.admin() + .cluster() + .searchShards(new ClusterSearchShardsRequest(restoredIndexName)) + .actionGet() + .getGroups(); + + // Ensure when no preferences are set (default preference), the only compatible shards are primary + for (ClusterSearchShardsGroup shardsGroup : shardGroups) { + assertEquals(1, shardsGroup.getShards().length); + assertTrue(shardsGroup.getShards()[0].primary()); + } + + // Ensure when preferences are set, all the compatible shards are returned + shardGroups = client.admin() + .cluster() + .searchShards(new ClusterSearchShardsRequest(restoredIndexName).preference("foo")) + .actionGet() + .getGroups(); + + // Ensures that the compatible shards are not just primaries + for (ClusterSearchShardsGroup shardsGroup : shardGroups) { + assertTrue(shardsGroup.getShards().length > 1); + boolean containsReplica = Arrays.stream(shardsGroup.getShards()) + .map(shardRouting -> !shardRouting.primary()) + .reduce(false, (s1, s2) -> s1 || s2); + assertTrue(containsReplica); + } + } + /** * Asserts the cache folder count to match the number of shards and the number of indices within the cache folder * as provided. diff --git a/server/src/main/java/org/opensearch/cluster/routing/OperationRouting.java b/server/src/main/java/org/opensearch/cluster/routing/OperationRouting.java index 9f1f4f5622a57..ade2cda797334 100644 --- a/server/src/main/java/org/opensearch/cluster/routing/OperationRouting.java +++ b/server/src/main/java/org/opensearch/cluster/routing/OperationRouting.java @@ -42,6 +42,7 @@ import org.opensearch.common.settings.Setting; import org.opensearch.common.settings.Settings; import org.opensearch.core.common.Strings; +import org.opensearch.index.IndexModule; import org.opensearch.index.IndexNotFoundException; import org.opensearch.index.shard.ShardId; import org.opensearch.node.ResponseCollectorService; @@ -238,6 +239,13 @@ public GroupShardsIterator searchShards( final Set shards = computeTargetedShards(clusterState, concreteIndices, routing); final Set set = new HashSet<>(shards.size()); for (IndexShardRoutingTable shard : shards) { + IndexMetadata indexMetadataForShard = indexMetadata(clusterState, shard.shardId.getIndex().getName()); + if (IndexModule.Type.REMOTE_SNAPSHOT.match( + indexMetadataForShard.getSettings().get(IndexModule.INDEX_STORE_TYPE_SETTING.getKey()) + ) && (preference == null || preference.isEmpty())) { + preference = Preference.PRIMARY.type(); + } + ShardIterator iterator = preferenceActiveShardIterator( shard, clusterState.nodes().getLocalNodeId(), diff --git a/server/src/main/java/org/opensearch/common/lucene/Lucene.java b/server/src/main/java/org/opensearch/common/lucene/Lucene.java index 28549544db1ba..b69a429577c0f 100644 --- a/server/src/main/java/org/opensearch/common/lucene/Lucene.java +++ b/server/src/main/java/org/opensearch/common/lucene/Lucene.java @@ -55,7 +55,6 @@ import org.apache.lucene.index.SegmentCommitInfo; import org.apache.lucene.index.SegmentInfos; import org.apache.lucene.index.SegmentReader; -import org.apache.lucene.index.StandardDirectoryReader; import org.apache.lucene.search.DocIdSetIterator; import 
org.apache.lucene.search.Explanation; import org.apache.lucene.search.FieldDoc; @@ -138,21 +137,12 @@ public static SegmentInfos readSegmentInfos(Directory directory) throws IOExcept /** * A variant of {@link #readSegmentInfos(Directory)} that supports reading indices written by - * older major versions of Lucene. The underlying implementation is a workaround since the - * "expert" readLatestCommit API is currently package-private in Lucene. First, all commits in - * the given {@link Directory} are listed - this result includes older Lucene commits. Then, - * the latest index commit is opened via {@link DirectoryReader} by including a minimum supported - * Lucene major version based on the minimum compatibility of the given {@link org.opensearch.Version}. + * older major versions of Lucene. This leverages Lucene's "expert" readLatestCommit API. The + * {@link org.opensearch.Version} parameter determines the minimum supported Lucene major version. */ - public static SegmentInfos readSegmentInfosExtendedCompatibility(Directory directory, org.opensearch.Version minimumVersion) - throws IOException { - // This list is sorted from oldest to latest - List indexCommits = DirectoryReader.listCommits(directory); - IndexCommit latestCommit = indexCommits.get(indexCommits.size() - 1); + public static SegmentInfos readSegmentInfos(Directory directory, org.opensearch.Version minimumVersion) throws IOException { final int minSupportedLuceneMajor = minimumVersion.minimumIndexCompatibilityVersion().luceneVersion.major; - try (StandardDirectoryReader reader = (StandardDirectoryReader) DirectoryReader.open(latestCommit, minSupportedLuceneMajor, null)) { - return reader.getSegmentInfos(); - } + return SegmentInfos.readLatestCommit(directory, minSupportedLuceneMajor); } /** diff --git a/server/src/main/java/org/opensearch/extensions/ExtensionsManager.java b/server/src/main/java/org/opensearch/extensions/ExtensionsManager.java index 337dbdd06e1af..2878aa047c667 100644 --- a/server/src/main/java/org/opensearch/extensions/ExtensionsManager.java +++ b/server/src/main/java/org/opensearch/extensions/ExtensionsManager.java @@ -11,7 +11,6 @@ import java.io.IOException; import java.io.InputStream; import java.net.InetAddress; -import java.net.UnknownHostException; import java.nio.file.Files; import java.nio.file.Path; import java.util.ArrayList; @@ -37,7 +36,6 @@ import org.opensearch.action.admin.cluster.state.ClusterStateResponse; import org.opensearch.client.node.NodeClient; import org.opensearch.cluster.ClusterSettingsResponse; -import org.opensearch.cluster.node.DiscoveryNode; import org.opensearch.cluster.service.ClusterService; import org.opensearch.core.util.FileSystemUtils; import org.opensearch.common.io.stream.StreamInput; @@ -58,12 +56,6 @@ import org.opensearch.extensions.rest.RestActionsRequestHandler; import org.opensearch.extensions.settings.CustomSettingsRequestHandler; import org.opensearch.extensions.settings.RegisterCustomSettingsRequest; -import org.opensearch.index.IndexModule; -import org.opensearch.index.IndexService; -import org.opensearch.index.IndicesModuleRequest; -import org.opensearch.index.IndicesModuleResponse; -import org.opensearch.index.shard.IndexEventListener; -import org.opensearch.indices.cluster.IndicesClusterStateService; import org.opensearch.threadpool.ThreadPool; import org.opensearch.transport.ConnectTransportException; import org.opensearch.transport.TransportException; @@ -80,8 +72,6 @@ */ public class ExtensionsManager { public static final String 
REQUEST_EXTENSION_ACTION_NAME = "internal:discovery/extensions"; - public static final String INDICES_EXTENSION_POINT_ACTION_NAME = "indices:internal/extensions"; - public static final String INDICES_EXTENSION_NAME_ACTION_NAME = "indices:internal/name"; public static final String REQUEST_EXTENSION_CLUSTER_STATE = "internal:discovery/clusterstate"; public static final String REQUEST_EXTENSION_CLUSTER_SETTINGS = "internal:discovery/clustersettings"; public static final String REQUEST_EXTENSION_ENVIRONMENT_SETTINGS = "internal:discovery/enviornmentsettings"; @@ -466,125 +456,6 @@ TransportResponse handleExtensionRequest(ExtensionRequest extensionRequest) thro } } - public void onIndexModule(IndexModule indexModule) throws UnknownHostException { - for (DiscoveryNode extensionNode : extensionIdMap.values()) { - onIndexModule(indexModule, extensionNode); - } - } - - private void onIndexModule(IndexModule indexModule, DiscoveryNode extensionNode) throws UnknownHostException { - logger.info("onIndexModule index:" + indexModule.getIndex()); - final CompletableFuture inProgressFuture = new CompletableFuture<>(); - final CompletableFuture inProgressIndexNameFuture = new CompletableFuture<>(); - final TransportResponseHandler acknowledgedResponseHandler = new TransportResponseHandler< - AcknowledgedResponse>() { - @Override - public void handleResponse(AcknowledgedResponse response) { - logger.info("ACK Response" + response); - inProgressIndexNameFuture.complete(response); - } - - @Override - public void handleException(TransportException exp) { - inProgressIndexNameFuture.completeExceptionally(exp); - } - - @Override - public String executor() { - return ThreadPool.Names.GENERIC; - } - - @Override - public AcknowledgedResponse read(StreamInput in) throws IOException { - return new AcknowledgedResponse(in); - } - - }; - - final TransportResponseHandler indicesModuleResponseHandler = new TransportResponseHandler< - IndicesModuleResponse>() { - - @Override - public IndicesModuleResponse read(StreamInput in) throws IOException { - return new IndicesModuleResponse(in); - } - - @Override - public void handleResponse(IndicesModuleResponse response) { - logger.info("received {}", response); - if (response.getIndexEventListener() == true) { - indexModule.addIndexEventListener(new IndexEventListener() { - @Override - public void beforeIndexRemoved( - IndexService indexService, - IndicesClusterStateService.AllocatedIndices.IndexRemovalReason reason - ) { - logger.info("Index Event Listener is called"); - String indexName = indexService.index().getName(); - logger.info("Index Name" + indexName.toString()); - try { - logger.info("Sending extension request type: " + INDICES_EXTENSION_NAME_ACTION_NAME); - transportService.sendRequest( - extensionNode, - INDICES_EXTENSION_NAME_ACTION_NAME, - new IndicesModuleRequest(indexModule), - acknowledgedResponseHandler - ); - inProgressIndexNameFuture.orTimeout(EXTENSION_REQUEST_WAIT_TIMEOUT, TimeUnit.SECONDS).join(); - } catch (CompletionException e) { - if (e.getCause() instanceof TimeoutException) { - logger.info("No response from extension to request."); - } - if (e.getCause() instanceof RuntimeException) { - throw (RuntimeException) e.getCause(); - } else if (e.getCause() instanceof Error) { - throw (Error) e.getCause(); - } else { - throw new RuntimeException(e.getCause()); - } - } - } - }); - } - inProgressFuture.complete(response); - } - - @Override - public void handleException(TransportException exp) { - logger.error(new ParameterizedMessage("IndicesModuleRequest 
failed"), exp); - inProgressFuture.completeExceptionally(exp); - } - - @Override - public String executor() { - return ThreadPool.Names.GENERIC; - } - }; - - try { - logger.info("Sending extension request type: " + INDICES_EXTENSION_POINT_ACTION_NAME); - transportService.sendRequest( - extensionNode, - INDICES_EXTENSION_POINT_ACTION_NAME, - new IndicesModuleRequest(indexModule), - indicesModuleResponseHandler - ); - inProgressFuture.orTimeout(EXTENSION_REQUEST_WAIT_TIMEOUT, TimeUnit.SECONDS).join(); - logger.info("Received response from Extension"); - } catch (CompletionException e) { - if (e.getCause() instanceof TimeoutException) { - logger.info("No response from extension to request."); - } - if (e.getCause() instanceof RuntimeException) { - throw (RuntimeException) e.getCause(); - } else if (e.getCause() instanceof Error) { - throw (Error) e.getCause(); - } else { - throw new RuntimeException(e.getCause()); - } - } - } - private ExtensionsSettings readFromExtensionsYml(Path filePath) throws IOException { Yaml yaml = new Yaml(); try (InputStream inputStream = Files.newInputStream(filePath)) { @@ -655,14 +526,6 @@ static String getRequestExtensionActionName() { return REQUEST_EXTENSION_ACTION_NAME; } - static String getIndicesExtensionPointActionName() { - return INDICES_EXTENSION_POINT_ACTION_NAME; - } - - static String getIndicesExtensionNameActionName() { - return INDICES_EXTENSION_NAME_ACTION_NAME; - } - static String getRequestExtensionClusterState() { return REQUEST_EXTENSION_CLUSTER_STATE; } diff --git a/server/src/main/java/org/opensearch/extensions/NoopExtensionsManager.java b/server/src/main/java/org/opensearch/extensions/NoopExtensionsManager.java index 6165423b767ce..eb9b389b7a4b1 100644 --- a/server/src/main/java/org/opensearch/extensions/NoopExtensionsManager.java +++ b/server/src/main/java/org/opensearch/extensions/NoopExtensionsManager.java @@ -9,7 +9,6 @@ package org.opensearch.extensions; import java.io.IOException; -import java.net.UnknownHostException; import java.nio.file.Path; import java.util.Optional; @@ -22,7 +21,6 @@ import org.opensearch.extensions.action.ExtensionActionRequest; import org.opensearch.extensions.action.ExtensionActionResponse; import org.opensearch.extensions.action.RemoteExtensionActionResponse; -import org.opensearch.index.IndexModule; import org.opensearch.transport.TransportService; /** @@ -70,11 +68,6 @@ public void initialize() { // no-op } - @Override - public void onIndexModule(IndexModule indexModule) throws UnknownHostException { - // no-op - } - @Override public Optional lookupInitializedExtensionById(final String extensionId) { // no-op not found diff --git a/server/src/main/java/org/opensearch/index/engine/ReadOnlyEngine.java b/server/src/main/java/org/opensearch/index/engine/ReadOnlyEngine.java index 73ec658f573c2..3d91fb348a066 100644 --- a/server/src/main/java/org/opensearch/index/engine/ReadOnlyEngine.java +++ b/server/src/main/java/org/opensearch/index/engine/ReadOnlyEngine.java @@ -131,7 +131,7 @@ public ReadOnlyEngine( // yet this makes sure nobody else does. including some testing tools that try to be messy indexWriterLock = obtainLock ? 
directory.obtainLock(IndexWriter.WRITE_LOCK_NAME) : null; if (isExtendedCompatibility()) { - this.lastCommittedSegmentInfos = Lucene.readSegmentInfosExtendedCompatibility(directory, this.minimumSupportedVersion); + this.lastCommittedSegmentInfos = Lucene.readSegmentInfos(directory, this.minimumSupportedVersion); } else { this.lastCommittedSegmentInfos = Lucene.readSegmentInfos(directory); } diff --git a/server/src/main/java/org/opensearch/index/shard/IndexShard.java b/server/src/main/java/org/opensearch/index/shard/IndexShard.java index 8b542be222f25..b417133e4a89d 100644 --- a/server/src/main/java/org/opensearch/index/shard/IndexShard.java +++ b/server/src/main/java/org/opensearch/index/shard/IndexShard.java @@ -2275,9 +2275,7 @@ private boolean assertSequenceNumbersInCommit() throws IOException { private Map fetchUserData() throws IOException { if (indexSettings.isRemoteSnapshot() && indexSettings.getExtendedCompatibilitySnapshotVersion() != null) { - // Inefficient method to support reading old Lucene indexes - return Lucene.readSegmentInfosExtendedCompatibility(store.directory(), indexSettings.getExtendedCompatibilitySnapshotVersion()) - .getUserData(); + return Lucene.readSegmentInfos(store.directory(), indexSettings.getExtendedCompatibilitySnapshotVersion()).getUserData(); } else { return SegmentInfos.readLatestCommit(store.directory()).getUserData(); } diff --git a/server/src/main/java/org/opensearch/index/store/Store.java b/server/src/main/java/org/opensearch/index/store/Store.java index 9b2c661569c16..dae698b1c3b46 100644 --- a/server/src/main/java/org/opensearch/index/store/Store.java +++ b/server/src/main/java/org/opensearch/index/store/Store.java @@ -269,7 +269,7 @@ private static SegmentInfos readSegmentsInfo(IndexCommit commit, Directory direc private static SegmentInfos readSegmentInfosExtendedCompatibility(Directory directory, org.opensearch.Version minimumVersion) throws IOException { try { - return Lucene.readSegmentInfosExtendedCompatibility(directory, minimumVersion); + return Lucene.readSegmentInfos(directory, minimumVersion); } catch (EOFException eof) { // TODO this should be caught by lucene - EOF is almost certainly an index corruption throw new CorruptIndexException("Read past EOF while reading segment infos", "", eof); diff --git a/server/src/main/java/org/opensearch/indices/IndicesService.java b/server/src/main/java/org/opensearch/indices/IndicesService.java index cdad2c45638e5..58a26f813d88d 100644 --- a/server/src/main/java/org/opensearch/indices/IndicesService.java +++ b/server/src/main/java/org/opensearch/indices/IndicesService.java @@ -148,7 +148,6 @@ import org.opensearch.indices.replication.common.ReplicationType; import org.opensearch.node.Node; import org.opensearch.plugins.IndexStorePlugin; -import org.opensearch.extensions.ExtensionsManager; import org.opensearch.plugins.PluginsService; import org.opensearch.repositories.RepositoriesService; import org.opensearch.script.ScriptService; @@ -289,7 +288,6 @@ public class IndicesService extends AbstractLifecycleComponent */ private final Settings settings; private final PluginsService pluginsService; - private final ExtensionsManager extensionsManager; private final NodeEnvironment nodeEnv; private final NamedXContentRegistry xContentRegistry; private final TimeValue shardsClosedTimeout; @@ -342,7 +340,6 @@ protected void doStart() { public IndicesService( Settings settings, PluginsService pluginsService, - ExtensionsManager extensionsManager, NodeEnvironment nodeEnv, NamedXContentRegistry 
xContentRegistry, AnalysisRegistry analysisRegistry, @@ -368,7 +365,6 @@ public IndicesService( this.settings = settings; this.threadPool = threadPool; this.pluginsService = pluginsService; - this.extensionsManager = extensionsManager; this.nodeEnv = nodeEnv; this.xContentRegistry = xContentRegistry; this.valuesSourceRegistry = valuesSourceRegistry; @@ -810,7 +806,6 @@ private synchronized IndexService createIndexService( indexModule.addIndexOperationListener(operationListener); } pluginsService.onIndexModule(indexModule); - extensionsManager.onIndexModule(indexModule); for (IndexEventListener listener : builtInListeners) { indexModule.addIndexEventListener(listener); } diff --git a/server/src/main/java/org/opensearch/node/Node.java b/server/src/main/java/org/opensearch/node/Node.java index 3827041a60aa3..316622717c9f9 100644 --- a/server/src/main/java/org/opensearch/node/Node.java +++ b/server/src/main/java/org/opensearch/node/Node.java @@ -707,7 +707,6 @@ protected Node( final IndicesService indicesService = new IndicesService( settings, pluginsService, - extensionsManager, nodeEnvironment, xContentRegistry, analysisModule.getAnalysisRegistry(), diff --git a/server/src/main/java/org/opensearch/plugins/SearchPipelinePlugin.java b/server/src/main/java/org/opensearch/plugins/SearchPipelinePlugin.java index 8e6fbef6c8b1d..b8ceddecd3d20 100644 --- a/server/src/main/java/org/opensearch/plugins/SearchPipelinePlugin.java +++ b/server/src/main/java/org/opensearch/plugins/SearchPipelinePlugin.java @@ -9,6 +9,8 @@ package org.opensearch.plugins; import org.opensearch.search.pipeline.Processor; +import org.opensearch.search.pipeline.SearchRequestProcessor; +import org.opensearch.search.pipeline.SearchResponseProcessor; import java.util.Collections; import java.util.Map; @@ -20,13 +22,24 @@ */ public interface SearchPipelinePlugin { /** - * Returns additional search pipeline processor types added by this plugin. + * Returns additional search pipeline request processor types added by this plugin. * * The key of the returned {@link Map} is the unique name for the processor which is specified * in pipeline configurations, and the value is a {@link org.opensearch.search.pipeline.Processor.Factory} * to create the processor from a given pipeline configuration. */ - default Map getProcessors(Processor.Parameters parameters) { + default Map> getRequestProcessors(Processor.Parameters parameters) { + return Collections.emptyMap(); + } + + /** + * Returns additional search pipeline response processor types added by this plugin. + * + * The key of the returned {@link Map} is the unique name for the processor which is specified + * in pipeline configurations, and the value is a {@link org.opensearch.search.pipeline.Processor.Factory} + * to create the processor from a given pipeline configuration. 
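+ * Factories returned here are only consulted for processors declared under a pipeline's response_processors section.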
+ */ + default Map> getResponseProcessors(Processor.Parameters parameters) { return Collections.emptyMap(); } } diff --git a/server/src/main/java/org/opensearch/script/ScriptModule.java b/server/src/main/java/org/opensearch/script/ScriptModule.java index b5527f6d8d07d..a192e9553016b 100644 --- a/server/src/main/java/org/opensearch/script/ScriptModule.java +++ b/server/src/main/java/org/opensearch/script/ScriptModule.java @@ -68,6 +68,7 @@ public class ScriptModule { SignificantTermsHeuristicScoreScript.CONTEXT, IngestScript.CONTEXT, IngestConditionalScript.CONTEXT, + SearchScript.CONTEXT, FilterScript.CONTEXT, SimilarityScript.CONTEXT, SimilarityWeightScript.CONTEXT, diff --git a/server/src/main/java/org/opensearch/script/SearchScript.java b/server/src/main/java/org/opensearch/script/SearchScript.java new file mode 100644 index 0000000000000..f40cc8ef080a4 --- /dev/null +++ b/server/src/main/java/org/opensearch/script/SearchScript.java @@ -0,0 +1,55 @@ +/* + * SPDX-License-Identifier: Apache-2.0 + * + * The OpenSearch Contributors require contributions made to + * this file be licensed under the Apache-2.0 license or a + * compatible open source license. + */ + +package org.opensearch.script; + +import org.opensearch.common.unit.TimeValue; + +import java.util.Map; + +/** + * A script used by the Search Script Processor. + * + * @opensearch.internal + */ +public abstract class SearchScript { + + public static final String[] PARAMETERS = { "ctx" }; + + /** The context used to compile {@link SearchScript} factories. */ + public static final ScriptContext CONTEXT = new ScriptContext<>( + "search", + Factory.class, + 200, + TimeValue.timeValueMillis(0), + ScriptCache.UNLIMITED_COMPILATION_RATE.asTuple() + ); + + /** The generic runtime parameters for the script. */ + private final Map params; + + public SearchScript(Map params) { + this.params = params; + } + + /** Return the parameters for this script. */ + public Map getParams() { + return params; + } + + public abstract void execute(Map ctx); + + /** + * Factory for search script + * + * @opensearch.internal + */ + public interface Factory { + SearchScript newInstance(Map params); + } +} diff --git a/server/src/main/java/org/opensearch/search/DefaultSearchContext.java b/server/src/main/java/org/opensearch/search/DefaultSearchContext.java index 1bde741973983..40081c087f09a 100644 --- a/server/src/main/java/org/opensearch/search/DefaultSearchContext.java +++ b/server/src/main/java/org/opensearch/search/DefaultSearchContext.java @@ -89,7 +89,6 @@ import org.opensearch.search.rescore.RescoreContext; import org.opensearch.search.slice.SliceBuilder; import org.opensearch.search.sort.SortAndFormats; -import org.opensearch.search.sort.SortOrder; import org.opensearch.search.suggest.SuggestionSearchContext; import java.io.IOException; @@ -212,7 +211,7 @@ final class DefaultSearchContext extends SearchContext { engineSearcher.getQueryCachingPolicy(), lowLevelCancellation, executor, - shouldReverseLeafReaderContexts() + this ); this.relativeTimeSupplier = relativeTimeSupplier; this.timeout = timeout; @@ -887,22 +886,4 @@ public boolean isCancelled() { public ReaderContext readerContext() { return readerContext; } - - private boolean shouldReverseLeafReaderContexts() { - // Time series based workload by default traverses segments in desc order i.e. latest to the oldest order. - // This is actually beneficial for search queries to start search on latest segments first for time series workload. 
- // That can slow down ASC order queries on timestamp workload. So to avoid that slowdown, we will reverse leaf - // reader order here. - if (this.indexShard.isTimeSeriesIndex()) { - // Only reverse order for asc order sort queries - if (request != null - && request.source() != null - && request.source().sorts() != null - && request.source().sorts().size() > 0 - && request.source().sorts().get(0).order() == SortOrder.ASC) { - return true; - } - } - return false; - } } diff --git a/server/src/main/java/org/opensearch/search/SearchService.java b/server/src/main/java/org/opensearch/search/SearchService.java index cdd30a2a8847d..efb5800879495 100644 --- a/server/src/main/java/org/opensearch/search/SearchService.java +++ b/server/src/main/java/org/opensearch/search/SearchService.java @@ -133,6 +133,7 @@ import org.opensearch.search.sort.MinAndMax; import org.opensearch.search.sort.SortAndFormats; import org.opensearch.search.sort.SortBuilder; +import org.opensearch.search.sort.SortOrder; import org.opensearch.search.suggest.Suggest; import org.opensearch.search.suggest.completion.CompletionSuggestion; import org.opensearch.threadpool.Scheduler.Cancellable; @@ -1525,7 +1526,7 @@ private CanMatchResponse canMatch(ShardSearchRequest request, boolean checkRefre final boolean aliasFilterCanMatch = request.getAliasFilter().getQueryBuilder() instanceof MatchNoneQueryBuilder == false; FieldSortBuilder sortBuilder = FieldSortBuilder.getPrimaryFieldSortOrNull(request.source()); MinAndMax minMax = sortBuilder != null ? FieldSortBuilder.getMinMaxOrNull(context, sortBuilder) : null; - final boolean canMatch; + boolean canMatch; if (canRewriteToMatchNone(request.source())) { QueryBuilder queryBuilder = request.source().query(); canMatch = aliasFilterCanMatch && queryBuilder instanceof MatchNoneQueryBuilder == false; @@ -1533,11 +1534,44 @@ private CanMatchResponse canMatch(ShardSearchRequest request, boolean checkRefre // null query means match_all canMatch = aliasFilterCanMatch; } + final FieldDoc searchAfterFieldDoc = getSearchAfterFieldDoc(request, context); + canMatch = canMatch && canMatchSearchAfter(searchAfterFieldDoc, minMax, sortBuilder); + return new CanMatchResponse(canMatch || hasRefreshPending, minMax); } } } + public static boolean canMatchSearchAfter(FieldDoc searchAfter, MinAndMax minMax, FieldSortBuilder primarySortField) { + if (searchAfter != null && minMax != null && primarySortField != null) { + final Object searchAfterPrimary = searchAfter.fields[0]; + if (primarySortField.order() == SortOrder.DESC) { + if (minMax.compareMin(searchAfterPrimary) > 0) { + // In Desc order, if segment/shard minimum is gt search_after, the segment/shard won't be competitive + return false; + } + } else { + if (minMax.compareMax(searchAfterPrimary) < 0) { + // In ASC order, if segment/shard maximum is lt search_after, the segment/shard won't be competitive + return false; + } + } + } + return true; + } + + private static FieldDoc getSearchAfterFieldDoc(ShardSearchRequest request, QueryShardContext context) throws IOException { + if (context != null && request != null && request.source() != null && request.source().sorts() != null) { + final List> sorts = request.source().sorts(); + final Object[] searchAfter = request.source().searchAfter(); + final Optional sortOpt = SortBuilder.buildSort(sorts, context); + if (sortOpt.isPresent() && !CollectionUtils.isEmpty(searchAfter)) { + return SearchAfterBuilder.buildFieldDoc(sortOpt.get(), searchAfter); + } + } + return null; + } + /** * Returns true iff the given 
search source builder can be early terminated by rewriting to a match none query. Or in other words * if the execution of the search request can be early terminated without executing it. This is for instance not possible if diff --git a/server/src/main/java/org/opensearch/search/internal/ContextIndexSearcher.java b/server/src/main/java/org/opensearch/search/internal/ContextIndexSearcher.java index 818da075fd7ec..79734b1e25005 100644 --- a/server/src/main/java/org/opensearch/search/internal/ContextIndexSearcher.java +++ b/server/src/main/java/org/opensearch/search/internal/ContextIndexSearcher.java @@ -65,6 +65,7 @@ import org.opensearch.common.lease.Releasable; import org.opensearch.common.lucene.search.TopDocsAndMaxScore; import org.opensearch.search.DocValueFormat; +import org.opensearch.search.SearchService; import org.opensearch.search.dfs.AggregatedDfs; import org.opensearch.search.profile.ContextualProfileBreakdown; import org.opensearch.search.profile.Timer; @@ -72,6 +73,9 @@ import org.opensearch.search.profile.query.QueryProfiler; import org.opensearch.search.profile.query.QueryTimingType; import org.opensearch.search.query.QuerySearchResult; +import org.opensearch.search.sort.FieldSortBuilder; +import org.opensearch.search.sort.MinAndMax; +import org.opensearch.search.sort.SortOrder; import java.io.IOException; import java.util.ArrayList; @@ -97,12 +101,7 @@ public class ContextIndexSearcher extends IndexSearcher implements Releasable { private AggregatedDfs aggregatedDfs; private QueryProfiler profiler; private MutableQueryTimeout cancellable; - - /** - * Certain queries can benefit if we reverse the segment read order, - * for example time series based queries if searched for desc sort order - */ - private final boolean reverseLeafReaderContexts; + private SearchContext searchContext; public ContextIndexSearcher( IndexReader reader, @@ -120,7 +119,7 @@ public ContextIndexSearcher( new MutableQueryTimeout(), wrapWithExitableDirectoryReader, executor, - false + null ); } @@ -131,7 +130,7 @@ public ContextIndexSearcher( QueryCachingPolicy queryCachingPolicy, boolean wrapWithExitableDirectoryReader, Executor executor, - boolean reverseLeafReaderContexts + SearchContext searchContext ) throws IOException { this( reader, @@ -141,7 +140,7 @@ public ContextIndexSearcher( new MutableQueryTimeout(), wrapWithExitableDirectoryReader, executor, - reverseLeafReaderContexts + searchContext ); } @@ -153,14 +152,14 @@ private ContextIndexSearcher( MutableQueryTimeout cancellable, boolean wrapWithExitableDirectoryReader, Executor executor, - boolean reverseLeafReaderContexts + SearchContext searchContext ) throws IOException { super(wrapWithExitableDirectoryReader ? new ExitableDirectoryReader((DirectoryReader) reader, cancellable) : reader, executor); setSimilarity(similarity); setQueryCache(queryCache); setQueryCachingPolicy(queryCachingPolicy); this.cancellable = cancellable; - this.reverseLeafReaderContexts = reverseLeafReaderContexts; + this.searchContext = searchContext; } public void setProfiler(QueryProfiler profiler) { @@ -284,8 +283,10 @@ public void search( @Override protected void search(List leaves, Weight weight, Collector collector) throws IOException { - if (reverseLeafReaderContexts) { + if (shouldReverseLeafReaderContexts()) { // reverse the segment search order if this flag is true. + // Certain queries can benefit if we reverse the segment read order, + // for example time series based queries if searched for desc sort order. 
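+ // Iterating the leaves back to front searches the most recently written segments first.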
for (int i = leaves.size() - 1; i >= 0; i--) { searchLeaf(leaves.get(i), weight, collector); } @@ -303,6 +304,12 @@ protected void search(List leaves, Weight weight, Collector c * the provided ctx. */ private void searchLeaf(LeafReaderContext ctx, Weight weight, Collector collector) throws IOException { + + // Check whether this leaf needs to be searched at all before collecting results. + if (canMatch(ctx) == false) { + return; + } + cancellable.checkCancelled(); weight = wrapWeight(weight); // See please https://github.com/apache/lucene/pull/964 @@ -478,4 +485,43 @@ public void clear() { runnables.clear(); } } + + private boolean canMatch(LeafReaderContext ctx) throws IOException { + // Skip a segment for search_after when its min/max values cannot be competitive. + return canMatchSearchAfter(ctx); + } + + private boolean canMatchSearchAfter(LeafReaderContext ctx) throws IOException { + if (searchContext != null && searchContext.request() != null && searchContext.request().source() != null) { + // Only applied on primary sort field and primary search_after. + FieldSortBuilder primarySortField = FieldSortBuilder.getPrimaryFieldSortOrNull(searchContext.request().source()); + if (primarySortField != null) { + MinAndMax minMax = FieldSortBuilder.getMinMaxOrNullForSegment( + this.searchContext.getQueryShardContext(), + ctx, + primarySortField + ); + return SearchService.canMatchSearchAfter(searchContext.searchAfter(), minMax, primarySortField); + } + } + return true; + } + + private boolean shouldReverseLeafReaderContexts() { + // Time series based workload by default traverses segments in desc order i.e. latest to the oldest order. + // This is actually beneficial for search queries to start search on latest segments first for time series workload. + // That can slow down ASC order queries on timestamp workload. So to avoid that slowdown, we will reverse leaf + // reader order here.
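+ // This mirrors the check that previously lived in DefaultSearchContext#shouldReverseLeafReaderContexts.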
+ if (searchContext != null && searchContext.indexShard().isTimeSeriesIndex()) { + // Only reverse order for asc order sort queries + if (searchContext.request() != null + && searchContext.request().source() != null + && searchContext.request().source().sorts() != null + && searchContext.request().source().sorts().size() > 0 + && searchContext.request().source().sorts().get(0).order() == SortOrder.ASC) { + return true; + } + } + return false; + } } diff --git a/server/src/main/java/org/opensearch/search/pipeline/Pipeline.java b/server/src/main/java/org/opensearch/search/pipeline/Pipeline.java index f5dce8ec728b2..c9a5f865d507e 100644 --- a/server/src/main/java/org/opensearch/search/pipeline/Pipeline.java +++ b/server/src/main/java/org/opensearch/search/pipeline/Pipeline.java @@ -46,7 +46,7 @@ class Pipeline { private final NamedWriteableRegistry namedWriteableRegistry; - Pipeline( + private Pipeline( String id, @Nullable String description, @Nullable Integer version, @@ -62,31 +62,24 @@ class Pipeline { this.namedWriteableRegistry = namedWriteableRegistry; } - public static Pipeline create( + static Pipeline create( String id, Map config, - Map processorFactories, + Map> requestProcessorFactories, + Map> responseProcessorFactories, NamedWriteableRegistry namedWriteableRegistry ) throws Exception { String description = ConfigurationUtils.readOptionalStringProperty(null, null, config, DESCRIPTION_KEY); Integer version = ConfigurationUtils.readIntProperty(null, null, config, VERSION_KEY, null); List> requestProcessorConfigs = ConfigurationUtils.readOptionalList(null, null, config, REQUEST_PROCESSORS_KEY); - List requestProcessors = readProcessors( - SearchRequestProcessor.class, - processorFactories, - requestProcessorConfigs - ); + List requestProcessors = readProcessors(requestProcessorFactories, requestProcessorConfigs); List> responseProcessorConfigs = ConfigurationUtils.readOptionalList( null, null, config, RESPONSE_PROCESSORS_KEY ); - List responseProcessors = readProcessors( - SearchResponseProcessor.class, - processorFactories, - responseProcessorConfigs - ); + List responseProcessors = readProcessors(responseProcessorFactories, responseProcessorConfigs); if (config.isEmpty() == false) { throw new OpenSearchParseException( "pipeline [" @@ -98,10 +91,8 @@ public static Pipeline create( return new Pipeline(id, description, version, requestProcessors, responseProcessors, namedWriteableRegistry); } - @SuppressWarnings("unchecked") // Cast is checked using isInstance private static List readProcessors( - Class processorType, - Map processorFactories, + Map> processorFactories, List> requestProcessorConfigs ) throws Exception { List processors = new ArrayList<>(); @@ -117,22 +108,10 @@ private static List readProcessors( Map config = (Map) entry.getValue(); String tag = ConfigurationUtils.readOptionalStringProperty(null, null, config, TAG_KEY); String description = ConfigurationUtils.readOptionalStringProperty(null, tag, config, DESCRIPTION_KEY); - Processor processor = processorFactories.get(type).create(processorFactories, tag, description, config); - if (processorType.isInstance(processor)) { - processors.add((T) processor); - } else { - throw new IllegalArgumentException("Processor type " + type + " is not a " + processorType.getSimpleName()); - } + processors.add(processorFactories.get(type).create(processorFactories, tag, description, config)); } } - return processors; - } - - List flattenAllProcessors() { - List allProcessors = new ArrayList<>(searchRequestProcessors.size() + 
searchResponseProcessors.size()); - allProcessors.addAll(searchRequestProcessors); - allProcessors.addAll(searchResponseProcessors); - return allProcessors; + return Collections.unmodifiableList(processors); } String getId() { diff --git a/server/src/main/java/org/opensearch/search/pipeline/Processor.java b/server/src/main/java/org/opensearch/search/pipeline/Processor.java index 44f268242b83c..ee28db1cc334d 100644 --- a/server/src/main/java/org/opensearch/search/pipeline/Processor.java +++ b/server/src/main/java/org/opensearch/search/pipeline/Processor.java @@ -52,7 +52,7 @@ public interface Processor { /** * A factory that knows how to construct a processor based on a map of maps. */ - interface Factory { + interface Factory { /** * Creates a processor based on the specified map of maps config. @@ -65,8 +65,7 @@ interface Factory { * Note: Implementations are responsible for removing the used configuration * keys, so that after creation the config map should be empty. */ - Processor create(Map processorFactories, String tag, String description, Map config) - throws Exception; + T create(Map> processorFactories, String tag, String description, Map config) throws Exception; } /** diff --git a/server/src/main/java/org/opensearch/search/pipeline/SearchPipelineInfo.java b/server/src/main/java/org/opensearch/search/pipeline/SearchPipelineInfo.java index 95d1e3720cbb3..ce38f3bfbac3e 100644 --- a/server/src/main/java/org/opensearch/search/pipeline/SearchPipelineInfo.java +++ b/server/src/main/java/org/opensearch/search/pipeline/SearchPipelineInfo.java @@ -8,15 +8,19 @@ package org.opensearch.search.pipeline; +import org.opensearch.Version; import org.opensearch.common.io.stream.StreamInput; import org.opensearch.common.io.stream.StreamOutput; import org.opensearch.core.xcontent.XContentBuilder; import org.opensearch.node.ReportingService; import java.io.IOException; +import java.util.Collections; import java.util.List; +import java.util.Map; import java.util.Objects; import java.util.Set; +import java.util.TreeMap; import java.util.TreeSet; /** @@ -26,45 +30,82 @@ */ public class SearchPipelineInfo implements ReportingService.Info { - private final Set processors; + private final Map> processors = new TreeMap<>(); - public SearchPipelineInfo(List processors) { - this.processors = new TreeSet<>(processors); // we use a treeset here to have a test-able / predictable order + public SearchPipelineInfo(Map> processors) { + for (Map.Entry> processorsEntry : processors.entrySet()) { + // we use a treeset here to have a test-able / predictable order + this.processors.put(processorsEntry.getKey(), new TreeSet<>(processorsEntry.getValue())); + } } /** * Read from a stream. */ public SearchPipelineInfo(StreamInput in) throws IOException { - processors = new TreeSet<>(); - final int size = in.readVInt(); - for (int i = 0; i < size; i++) { - processors.add(new ProcessorInfo(in)); + if (in.getVersion().before(Version.V_2_8_0)) { + // Prior to version 2.8, we had a flat list of processors. For best compatibility, assume they're valid + // request and response processors, since we couldn't tell the difference back then.
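+ // Legacy wire format: a single flat count followed by that many ProcessorInfo entries.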
+ final int size = in.readVInt(); + Set processorInfos = new TreeSet<>(); + for (int i = 0; i < size; i++) { + processorInfos.add(new ProcessorInfo(in)); + } + processors.put(Pipeline.REQUEST_PROCESSORS_KEY, processorInfos); + processors.put(Pipeline.RESPONSE_PROCESSORS_KEY, processorInfos); + } else { + final int numTypes = in.readVInt(); + for (int i = 0; i < numTypes; i++) { + String type = in.readString(); + int numProcessors = in.readVInt(); + Set processorInfos = new TreeSet<>(); + for (int j = 0; j < numProcessors; j++) { + processorInfos.add(new ProcessorInfo(in)); + } + processors.put(type, processorInfos); + } } } @Override public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException { builder.startObject("search_pipelines"); - builder.startArray("processors"); - for (ProcessorInfo info : processors) { - info.toXContent(builder, params); + for (Map.Entry> processorEntry : processors.entrySet()) { + builder.startArray(processorEntry.getKey()); + for (ProcessorInfo info : processorEntry.getValue()) { + info.toXContent(builder, params); + } + builder.endArray(); } - builder.endArray(); builder.endObject(); return builder; } @Override public void writeTo(StreamOutput out) throws IOException { - out.write(processors.size()); - for (ProcessorInfo info : processors) { - info.writeTo(out); + if (out.getVersion().before(Version.V_2_8_0)) { + // Prior to version 2.8, we grouped all processors into a single list. + Set processorInfos = new TreeSet<>(); + processorInfos.addAll(processors.getOrDefault(Pipeline.REQUEST_PROCESSORS_KEY, Collections.emptySet())); + processorInfos.addAll(processors.getOrDefault(Pipeline.RESPONSE_PROCESSORS_KEY, Collections.emptySet())); + out.writeVInt(processorInfos.size()); + for (ProcessorInfo processorInfo : processorInfos) { + processorInfo.writeTo(out); + } + } else { + out.write(processors.size()); + for (Map.Entry> processorsEntry : processors.entrySet()) { + out.writeString(processorsEntry.getKey()); + out.writeVInt(processorsEntry.getValue().size()); + for (ProcessorInfo processorInfo : processorsEntry.getValue()) { + processorInfo.writeTo(out); + } + } } } - public boolean containsProcessor(String type) { - return processors.contains(new ProcessorInfo(type)); + public boolean containsProcessor(String processorType, String type) { + return processors.containsKey(processorType) && processors.get(processorType).contains(new ProcessorInfo(type)); } @Override diff --git a/server/src/main/java/org/opensearch/search/pipeline/SearchPipelineService.java b/server/src/main/java/org/opensearch/search/pipeline/SearchPipelineService.java index f96a6eb4a6b76..a486e636cbb7d 100644 --- a/server/src/main/java/org/opensearch/search/pipeline/SearchPipelineService.java +++ b/server/src/main/java/org/opensearch/search/pipeline/SearchPipelineService.java @@ -55,6 +55,8 @@ import java.util.Set; import java.util.concurrent.CopyOnWriteArrayList; import java.util.function.Consumer; +import java.util.function.Function; +import java.util.stream.Collectors; /** * The main entry point for search pipelines. 
Handles CRUD operations and exposes the API to execute search pipelines @@ -68,7 +70,8 @@ public class SearchPipelineService implements ClusterStateApplier, ReportingServ private static final Logger logger = LogManager.getLogger(SearchPipelineService.class); private final ClusterService clusterService; private final ScriptService scriptService; - private final Map processorFactories; + private final Map> requestProcessorFactories; + private final Map> responseProcessorFactories; private volatile Map pipelines = Collections.emptyMap(); private final ThreadPool threadPool; private final List> searchPipelineClusterStateListeners = new CopyOnWriteArrayList<>(); @@ -95,34 +98,33 @@ public SearchPipelineService( this.scriptService = scriptService; this.threadPool = threadPool; this.namedWriteableRegistry = namedWriteableRegistry; - this.processorFactories = processorFactories( - searchPipelinePlugins, - new Processor.Parameters( - env, - scriptService, - analysisRegistry, - threadPool.getThreadContext(), - threadPool::relativeTimeInMillis, - (delay, command) -> threadPool.schedule(command, TimeValue.timeValueMillis(delay), ThreadPool.Names.GENERIC), - this, - client, - threadPool.generic()::execute, - namedXContentRegistry - ) + Processor.Parameters parameters = new Processor.Parameters( + env, + scriptService, + analysisRegistry, + threadPool.getThreadContext(), + threadPool::relativeTimeInMillis, + (delay, command) -> threadPool.schedule(command, TimeValue.timeValueMillis(delay), ThreadPool.Names.GENERIC), + this, + client, + threadPool.generic()::execute, + namedXContentRegistry ); + this.requestProcessorFactories = processorFactories(searchPipelinePlugins, p -> p.getRequestProcessors(parameters)); + this.responseProcessorFactories = processorFactories(searchPipelinePlugins, p -> p.getResponseProcessors(parameters)); putPipelineTaskKey = clusterService.registerClusterManagerTask(ClusterManagerTaskKeys.PUT_SEARCH_PIPELINE_KEY, true); deletePipelineTaskKey = clusterService.registerClusterManagerTask(ClusterManagerTaskKeys.DELETE_SEARCH_PIPELINE_KEY, true); this.isEnabled = isEnabled; } - private static Map processorFactories( + private static Map> processorFactories( List searchPipelinePlugins, - Processor.Parameters parameters + Function>> processorLoader ) { - Map processorFactories = new HashMap<>(); + Map> processorFactories = new HashMap<>(); for (SearchPipelinePlugin searchPipelinePlugin : searchPipelinePlugins) { - Map newProcessors = searchPipelinePlugin.getProcessors(parameters); - for (Map.Entry entry : newProcessors.entrySet()) { + Map> newProcessors = processorLoader.apply(searchPipelinePlugin); + for (Map.Entry> entry : newProcessors.entrySet()) { if (processorFactories.put(entry.getKey(), entry.getValue()) != null) { throw new IllegalArgumentException("Search processor [" + entry.getKey() + "] is already registered"); } @@ -173,7 +175,8 @@ void innerUpdatePipelines(SearchPipelineMetadata newSearchPipelineMetadata) { Pipeline newPipeline = Pipeline.create( newConfiguration.getId(), newConfiguration.getConfigAsMap(), - processorFactories, + requestProcessorFactories, + responseProcessorFactories, namedWriteableRegistry ); newPipelines.put(newConfiguration.getId(), new PipelineHolder(newConfiguration, newPipeline)); @@ -268,12 +271,27 @@ void validatePipeline(Map searchPipelineInfos throw new IllegalStateException("Search pipeline info is empty"); } Map pipelineConfig = XContentHelper.convertToMap(request.getSource(), false, request.getXContentType()).v2(); - Pipeline pipeline = 
Pipeline.create(request.getId(), pipelineConfig, processorFactories, namedWriteableRegistry); + Pipeline pipeline = Pipeline.create( + request.getId(), + pipelineConfig, + requestProcessorFactories, + responseProcessorFactories, + namedWriteableRegistry + ); List exceptions = new ArrayList<>(); - for (Processor processor : pipeline.flattenAllProcessors()) { + for (SearchRequestProcessor processor : pipeline.getSearchRequestProcessors()) { + for (Map.Entry entry : searchPipelineInfos.entrySet()) { + String type = processor.getType(); + if (entry.getValue().containsProcessor(Pipeline.REQUEST_PROCESSORS_KEY, type) == false) { + String message = "Processor type [" + processor.getType() + "] is not installed on node [" + entry.getKey() + "]"; + exceptions.add(ConfigurationUtils.newConfigurationException(processor.getType(), processor.getTag(), null, message)); + } + } + } + for (SearchResponseProcessor processor : pipeline.getSearchResponseProcessors()) { for (Map.Entry entry : searchPipelineInfos.entrySet()) { String type = processor.getType(); - if (entry.getValue().containsProcessor(type) == false) { + if (entry.getValue().containsProcessor(Pipeline.RESPONSE_PROCESSORS_KEY, type) == false) { String message = "Processor type [" + processor.getType() + "] is not installed on node [" + entry.getKey() + "]"; exceptions.add(ConfigurationUtils.newConfigurationException(processor.getType(), processor.getTag(), null, message)); } @@ -352,7 +370,8 @@ public PipelinedRequest resolvePipeline(SearchRequest searchRequest) throws Exce pipeline = Pipeline.create( AD_HOC_PIPELINE_ID, searchRequest.source().searchPipelineSource(), - processorFactories, + requestProcessorFactories, + responseProcessorFactories, namedWriteableRegistry ); } catch (Exception e) { @@ -385,17 +404,27 @@ public PipelinedRequest resolvePipeline(SearchRequest searchRequest) throws Exce return new PipelinedRequest(pipeline, transformedRequest); } - Map getProcessorFactories() { - return processorFactories; + Map> getRequestProcessorFactories() { + return requestProcessorFactories; + } + + Map> getResponseProcessorFactories() { + return responseProcessorFactories; } @Override public SearchPipelineInfo info() { - List processorInfoList = new ArrayList<>(); - for (Map.Entry entry : processorFactories.entrySet()) { - processorInfoList.add(new ProcessorInfo(entry.getKey())); - } - return new SearchPipelineInfo(processorInfoList); + List requestProcessorInfoList = requestProcessorFactories.keySet() + .stream() + .map(ProcessorInfo::new) + .collect(Collectors.toList()); + List responseProcessorInfoList = responseProcessorFactories.keySet() + .stream() + .map(ProcessorInfo::new) + .collect(Collectors.toList()); + return new SearchPipelineInfo( + Map.of(Pipeline.REQUEST_PROCESSORS_KEY, requestProcessorInfoList, Pipeline.RESPONSE_PROCESSORS_KEY, responseProcessorInfoList) + ); } public static List getPipelines(ClusterState clusterState, String... 
ids) { diff --git a/server/src/main/java/org/opensearch/search/sort/FieldSortBuilder.java b/server/src/main/java/org/opensearch/search/sort/FieldSortBuilder.java index 8f33b51ff392d..0b7b9cd07c300 100644 --- a/server/src/main/java/org/opensearch/search/sort/FieldSortBuilder.java +++ b/server/src/main/java/org/opensearch/search/sort/FieldSortBuilder.java @@ -34,6 +34,7 @@ import org.apache.lucene.document.LongPoint; import org.apache.lucene.index.IndexReader; +import org.apache.lucene.index.LeafReaderContext; import org.apache.lucene.index.MultiTerms; import org.apache.lucene.index.PointValues; import org.apache.lucene.index.Terms; @@ -605,17 +606,31 @@ public static FieldSortBuilder getPrimaryFieldSortOrNull(SearchSourceBuilder sou } /** - * Return the {@link MinAndMax} indexed value from the provided {@link FieldSortBuilder} or null if unknown. + * Return the {@link MinAndMax} indexed value for shard from the provided {@link FieldSortBuilder} or null if unknown. * The value can be extracted on non-nested indexed mapped fields of type keyword, numeric or date, other fields * and configurations return null. */ public static MinAndMax getMinMaxOrNull(QueryShardContext context, FieldSortBuilder sortBuilder) throws IOException { + return getMinMaxOrNullInternal(context.getIndexReader(), context, sortBuilder); + } + + /** + * Return the {@link MinAndMax} indexed value for segment from the provided {@link FieldSortBuilder} or null if unknown. + * The value can be extracted on non-nested indexed mapped fields of type keyword, numeric or date, other fields + * and configurations return null. + */ + public static MinAndMax getMinMaxOrNullForSegment(QueryShardContext context, LeafReaderContext ctx, FieldSortBuilder sortBuilder) + throws IOException { + return getMinMaxOrNullInternal(ctx.reader(), context, sortBuilder); + } + + private static MinAndMax getMinMaxOrNullInternal(IndexReader reader, QueryShardContext context, FieldSortBuilder sortBuilder) + throws IOException { SortAndFormats sort = SortBuilder.buildSort(Collections.singletonList(sortBuilder), context).get(); SortField sortField = sort.sort.getSort()[0]; if (sortField.getField() == null) { return null; } - IndexReader reader = context.getIndexReader(); MappedFieldType fieldType = context.fieldMapper(sortField.getField()); if (reader == null || (fieldType == null || fieldType.isSearchable() == false)) { return null; diff --git a/server/src/main/java/org/opensearch/search/sort/MinAndMax.java b/server/src/main/java/org/opensearch/search/sort/MinAndMax.java index c5aae37f49052..7e655ca029035 100644 --- a/server/src/main/java/org/opensearch/search/sort/MinAndMax.java +++ b/server/src/main/java/org/opensearch/search/sort/MinAndMax.java @@ -32,12 +32,14 @@ package org.opensearch.search.sort; +import org.apache.lucene.util.BytesRef; import org.opensearch.common.io.stream.StreamInput; import org.opensearch.common.io.stream.StreamOutput; import org.opensearch.common.io.stream.Writeable; import org.opensearch.common.lucene.Lucene; import java.io.IOException; +import java.math.BigInteger; import java.util.Comparator; import java.util.Objects; @@ -93,4 +95,36 @@ public static Comparator> getComparator(SortOrder order) { } return Comparator.nullsLast(cmp); } + + /** + * Compare given object with min + */ + public int compareMin(Object object) { + return compare(getMin(), object); + } + + /** + * Compare given object with max + */ + public int compareMax(Object object) { + return compare(getMax(), object); + } + + private int compare(T one, Object two) 
diff --git a/server/src/main/java/org/opensearch/search/sort/MinAndMax.java b/server/src/main/java/org/opensearch/search/sort/MinAndMax.java
index c5aae37f49052..7e655ca029035 100644
--- a/server/src/main/java/org/opensearch/search/sort/MinAndMax.java
+++ b/server/src/main/java/org/opensearch/search/sort/MinAndMax.java
@@ -32,12 +32,14 @@

 package org.opensearch.search.sort;

+import org.apache.lucene.util.BytesRef;
 import org.opensearch.common.io.stream.StreamInput;
 import org.opensearch.common.io.stream.StreamOutput;
 import org.opensearch.common.io.stream.Writeable;
 import org.opensearch.common.lucene.Lucene;

 import java.io.IOException;
+import java.math.BigInteger;
 import java.util.Comparator;
 import java.util.Objects;

@@ -93,4 +95,36 @@ public static Comparator<MinAndMax<?>> getComparator(SortOrder order) {
         }
         return Comparator.nullsLast(cmp);
     }
+
+    /**
+     * Compare the given object with the min value.
+     */
+    public int compareMin(Object object) {
+        return compare(getMin(), object);
+    }
+
+    /**
+     * Compare the given object with the max value.
+     */
+    public int compareMax(Object object) {
+        return compare(getMax(), object);
+    }
+
+    private int compare(T one, Object two) {
+        if (one instanceof Long) {
+            return Long.compare((Long) one, (Long) two);
+        } else if (one instanceof Integer) {
+            return Integer.compare((Integer) one, (Integer) two);
+        } else if (one instanceof Float) {
+            return Float.compare((Float) one, (Float) two);
+        } else if (one instanceof Double) {
+            return Double.compare((Double) one, (Double) two);
+        } else if (one instanceof BigInteger) {
+            return ((BigInteger) one).compareTo((BigInteger) two);
+        } else if (one instanceof BytesRef) {
+            return ((BytesRef) one).compareTo((BytesRef) two);
+        } else {
+            throw new UnsupportedOperationException("compare type not supported : " + one.getClass());
+        }
+    }
 }
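
compareMin and compareMax only support the element types a MinAndMax is actually built from (Long, Integer, Float, Double, BigInteger, BytesRef); anything else fails fast. A short illustration (a sketch, not taken from the patch):

    MinAndMax<Long> bounds = new MinAndMax<>(0L, 9L);
    assert bounds.compareMin(15L) < 0;   // min (0) sorts before 15
    assert bounds.compareMax(15L) < 0;   // max (9) sorts before 15
    assert bounds.compareMax(9L) == 0;   // equal to max
    new MinAndMax<>("a", "b").compareMin("c"); // throws UnsupportedOperationException
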
diff --git a/server/src/main/java/org/opensearch/tasks/CancellableTask.java b/server/src/main/java/org/opensearch/tasks/CancellableTask.java
index 336f5b1f4c244..d32d04f006bbd 100644
--- a/server/src/main/java/org/opensearch/tasks/CancellableTask.java
+++ b/server/src/main/java/org/opensearch/tasks/CancellableTask.java
@@ -50,6 +50,14 @@ public abstract class CancellableTask extends Task {
     private volatile String reason;
     private final AtomicBoolean cancelled = new AtomicBoolean(false);
     private final TimeValue cancelAfterTimeInterval;
+    /**
+     * The time this task was cancelled as a wall clock time since epoch ({@link System#currentTimeMillis()} style).
+     */
+    private Long cancellationStartTime = null;
+    /**
+     * The time this task was cancelled as a relative time ({@link System#nanoTime()} style).
+     */
+    private Long cancellationStartTimeNanos = null;

     public CancellableTask(long id, String type, String action, String description, TaskId parentTaskId, Map<String, String> headers) {
         this(id, type, action, description, parentTaskId, headers, NO_TIMEOUT);
@@ -74,6 +82,8 @@ public CancellableTask(
     public void cancel(String reason) {
         assert reason != null;
         if (cancelled.compareAndSet(false, true)) {
+            this.cancellationStartTime = System.currentTimeMillis();
+            this.cancellationStartTimeNanos = System.nanoTime();
             this.reason = reason;
             onCancelled();
         }
@@ -87,6 +97,14 @@ public boolean cancelOnParentLeaving() {
         return true;
     }

+    public Long getCancellationStartTime() {
+        return cancellationStartTime;
+    }
+
+    public Long getCancellationStartTimeNanos() {
+        return cancellationStartTimeNanos;
+    }
+
     /**
      * Returns true if this task can potentially have children that need to be cancelled when its parent is cancelled.
      */
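
Recording both clocks at cancellation is deliberate: the wall-clock value is the one reported to users, while the nanoTime value is the one that is safe for computing elapsed durations. A hypothetical consumer (not part of the patch; assumes a CancellableTask reference named task):

    if (task.isCancelled() && task.getCancellationStartTimeNanos() != null) {
        long nanosSinceCancellation = System.nanoTime() - task.getCancellationStartTimeNanos();
        // a large value here flags a task that keeps running long after being cancelled
    }
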
diff --git a/server/src/main/java/org/opensearch/tasks/Task.java b/server/src/main/java/org/opensearch/tasks/Task.java
index 638c9c7ab41bd..23266dca45efa 100644
--- a/server/src/main/java/org/opensearch/tasks/Task.java
+++ b/server/src/main/java/org/opensearch/tasks/Task.java
@@ -192,6 +192,11 @@ protected final TaskInfo taskInfo(String localNodeId, String description, Status
      * Build a proper {@link TaskInfo} for this task.
      */
     protected final TaskInfo taskInfo(String localNodeId, String description, Status status, TaskResourceStats resourceStats) {
+        boolean cancelled = this instanceof CancellableTask && ((CancellableTask) this).isCancelled();
+        Long cancellationStartTime = null;
+        if (cancelled) {
+            cancellationStartTime = ((CancellableTask) this).getCancellationStartTime();
+        }
         return new TaskInfo(
             new TaskId(localNodeId, getId()),
             getType(),
@@ -201,10 +206,11 @@ protected final TaskInfo taskInfo(String localNodeId, String description, Status
             startTime,
             System.nanoTime() - startTimeNanos,
             this instanceof CancellableTask,
-            this instanceof CancellableTask && ((CancellableTask) this).isCancelled(),
+            cancelled,
             parentTask,
             headers,
-            resourceStats
+            resourceStats,
+            cancellationStartTime
         );
     }
diff --git a/server/src/main/java/org/opensearch/tasks/TaskInfo.java b/server/src/main/java/org/opensearch/tasks/TaskInfo.java
index 63bc46f8cfca6..2e5415279d804 100644
--- a/server/src/main/java/org/opensearch/tasks/TaskInfo.java
+++ b/server/src/main/java/org/opensearch/tasks/TaskInfo.java
@@ -85,6 +85,8 @@ public final class TaskInfo implements Writeable, ToXContentFragment {

     private final boolean cancelled;

+    private final Long cancellationStartTime;
+
     private final TaskId parentTaskId;

     private final Map<String, String> headers;

@@ -104,6 +106,38 @@ public TaskInfo(
         TaskId parentTaskId,
         Map<String, String> headers,
         TaskResourceStats resourceStats
+    ) {
+        this(
+            taskId,
+            type,
+            action,
+            description,
+            status,
+            startTime,
+            runningTimeNanos,
+            cancellable,
+            cancelled,
+            parentTaskId,
+            headers,
+            resourceStats,
+            null
+        );
+    }
+
+    public TaskInfo(
+        TaskId taskId,
+        String type,
+        String action,
+        String description,
+        Task.Status status,
+        long startTime,
+        long runningTimeNanos,
+        boolean cancellable,
+        boolean cancelled,
+        TaskId parentTaskId,
+        Map<String, String> headers,
+        TaskResourceStats resourceStats,
+        Long cancellationStartTime
     ) {
         if (cancellable == false && cancelled == true) {
             throw new IllegalArgumentException("task cannot be cancelled");
@@ -120,6 +154,7 @@ public TaskInfo(
         this.parentTaskId = parentTaskId;
         this.headers = headers;
         this.resourceStats = resourceStats;
+        this.cancellationStartTime = cancellationStartTime;
     }

     /**
@@ -150,6 +185,11 @@ public TaskInfo(StreamInput in) throws IOException {
         } else {
             resourceStats = null;
         }
+        if (in.getVersion().onOrAfter(Version.V_3_0_0)) {
+            cancellationStartTime = in.readOptionalLong();
+        } else {
+            cancellationStartTime = null;
+        }
     }

     @Override
@@ -170,6 +210,9 @@ public void writeTo(StreamOutput out) throws IOException {
         if (out.getVersion().onOrAfter(Version.V_2_1_0)) {
             out.writeOptionalWriteable(resourceStats);
         }
+        if (out.getVersion().onOrAfter(Version.V_3_0_0)) {
+            out.writeOptionalLong(cancellationStartTime);
+        }
     }

     public TaskId getTaskId() {
@@ -228,6 +271,10 @@ public boolean isCancelled() {
         return cancelled;
     }

+    public Long getCancellationStartTime() {
+        return cancellationStartTime;
+    }
+
     /**
      * Returns the parent task id
      */
@@ -281,6 +328,9 @@ public XContentBuilder toXContent(XContentBuilder builder, Params params) throws
             resourceStats.toXContent(builder, params);
             builder.endObject();
         }
+        if (cancellationStartTime != null) {
+            builder.humanReadableField("cancellation_time_millis", "cancellation_time", new TimeValue(cancellationStartTime));
+        }
         return builder;
     }

@@ -308,6 +358,7 @@ public static TaskInfo fromXContent(XContentParser parser) {
         }
         @SuppressWarnings("unchecked")
         TaskResourceStats resourceStats = (TaskResourceStats) a[i++];
+        Long cancellationStartTime = (Long) a[i++];
         RawTaskStatus status = statusBytes == null ? null : new RawTaskStatus(statusBytes);
         TaskId parentTaskId = parentTaskIdString == null ? TaskId.EMPTY_TASK_ID : new TaskId(parentTaskIdString);
         return new TaskInfo(
@@ -322,7 +373,8 @@ public static TaskInfo fromXContent(XContentParser parser) {
             cancelled,
             parentTaskId,
             headers,
-            resourceStats
+            resourceStats,
+            cancellationStartTime
         );
     });

     static {
@@ -341,6 +393,7 @@ public static TaskInfo fromXContent(XContentParser parser) {
         PARSER.declareString(optionalConstructorArg(), new ParseField("parent_task_id"));
         PARSER.declareObject(optionalConstructorArg(), (p, c) -> p.mapStrings(), new ParseField("headers"));
         PARSER.declareObject(optionalConstructorArg(), (p, c) -> TaskResourceStats.fromXContent(p), new ParseField("resource_stats"));
+        PARSER.declareLong(optionalConstructorArg(), new ParseField("cancellation_time_millis"));
     }

     @Override
@@ -366,7 +419,8 @@ public boolean equals(Object obj) {
             && Objects.equals(cancelled, other.cancelled)
             && Objects.equals(status, other.status)
             && Objects.equals(headers, other.headers)
-            && Objects.equals(resourceStats, other.resourceStats);
+            && Objects.equals(resourceStats, other.resourceStats)
+            && Objects.equals(cancellationStartTime, other.cancellationStartTime);
     }

     @Override
@@ -383,7 +437,8 @@ public int hashCode() {
             cancelled,
             status,
             headers,
-            resourceStats
+            resourceStats,
+            cancellationStartTime
         );
     }
 }
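
TaskInfo follows the usual wire-compatibility pattern for the new optional field: it is only written to peers that understand it, and reads default to null when the sender predates it, so mixed-version clusters keep working. The pattern in isolation (a sketch distilled from the hunks above; only the version constant is specific to this change):

    // writer side: skip the field entirely for pre-3.0.0 peers
    if (out.getVersion().onOrAfter(Version.V_3_0_0)) {
        out.writeOptionalLong(cancellationStartTime);
    }
    // reader side: the field is absent on older wires, so fall back to null
    cancellationStartTime = in.getVersion().onOrAfter(Version.V_3_0_0) ? in.readOptionalLong() : null;
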
diff --git a/server/src/test/java/org/opensearch/action/admin/cluster/node/tasks/TaskTests.java b/server/src/test/java/org/opensearch/action/admin/cluster/node/tasks/TaskTests.java
index 45db94577f15f..5ef99bae698a3 100644
--- a/server/src/test/java/org/opensearch/action/admin/cluster/node/tasks/TaskTests.java
+++ b/server/src/test/java/org/opensearch/action/admin/cluster/node/tasks/TaskTests.java
@@ -89,6 +89,7 @@ public void testCancellableOptionWhenCancelledTrue() {
         long taskId = randomIntBetween(0, 100000);
         long startTime = randomNonNegativeLong();
         long runningTime = randomNonNegativeLong();
+        long cancellationStartTime = randomNonNegativeLong();
         boolean cancellable = true;
         boolean cancelled = true;
         TaskInfo taskInfo = new TaskInfo(
@@ -103,12 +104,14 @@ public void testCancellableOptionWhenCancelledTrue() {
             cancelled,
             TaskId.EMPTY_TASK_ID,
             Collections.singletonMap("foo", "bar"),
-            randomResourceStats(randomBoolean())
+            randomResourceStats(randomBoolean()),
+            cancellationStartTime
         );
         String taskInfoString = taskInfo.toString();
         Map<String, Object> map = XContentHelper.convertToMap(new BytesArray(taskInfoString.getBytes(StandardCharsets.UTF_8)), true).v2();
         assertEquals(map.get("cancellable"), cancellable);
         assertEquals(map.get("cancelled"), cancelled);
+        assertEquals(map.get("cancellation_time_millis"), cancellationStartTime);
     }

     public void testCancellableOptionWhenCancelledFalse() {
diff --git a/server/src/test/java/org/opensearch/cluster/routing/OperationRoutingTests.java b/server/src/test/java/org/opensearch/cluster/routing/OperationRoutingTests.java
index e9cbb42369a19..55885fb61ee0c 100644
--- a/server/src/test/java/org/opensearch/cluster/routing/OperationRoutingTests.java
+++ b/server/src/test/java/org/opensearch/cluster/routing/OperationRoutingTests.java
@@ -46,11 +46,13 @@
 import org.opensearch.common.unit.TimeValue;
 import org.opensearch.common.util.io.IOUtils;
 import org.opensearch.index.Index;
+import org.opensearch.index.IndexModule;
 import org.opensearch.index.shard.ShardId;
 import org.opensearch.node.ResponseCollectorService;
 import org.opensearch.test.ClusterServiceUtils;
 import org.opensearch.test.OpenSearchTestCase;
 import org.opensearch.threadpool.TestThreadPool;
+import org.opensearch.threadpool.ThreadPool;

 import java.io.IOException;
 import java.util.ArrayList;
@@ -992,6 +994,66 @@ public void testWeightedOperationRoutingWeightUndefinedForOneZone() throws Excep
         }
     }

+    public void testSearchableSnapshotPrimaryDefault() throws Exception {
+        final int numIndices = 1;
+        final int numShards = 2;
+        final int numReplicas = 2;
+        final String[] indexNames = new String[numIndices];
+        for (int i = 0; i < numIndices; i++) {
+            indexNames[i] = "test" + i;
+        }
+        // The first index is a searchable snapshot index
+        final String searchableSnapshotIndex = indexNames[0];
+        ClusterService clusterService = null;
+        ThreadPool threadPool = null;
+
+        try {
+            OperationRouting opRouting = new OperationRouting(
+                Settings.EMPTY,
+                new ClusterSettings(Settings.EMPTY, ClusterSettings.BUILT_IN_CLUSTER_SETTINGS)
+            );
+
+            ClusterState state = ClusterStateCreationUtils.stateWithAssignedPrimariesAndReplicas(indexNames, numShards, numReplicas);
+            threadPool = new TestThreadPool("testSearchableSnapshotPreference");
+            clusterService = ClusterServiceUtils.createClusterService(threadPool);
+
+            // Update the index config within the cluster state to modify the index to a searchable snapshot index
+            IndexMetadata searchableSnapshotIndexMetadata = IndexMetadata.builder(searchableSnapshotIndex)
+                .settings(
+                    Settings.builder()
+                        .put(state.metadata().index(searchableSnapshotIndex).getSettings())
+                        .put(IndexModule.INDEX_STORE_TYPE_SETTING.getKey(), IndexModule.Type.REMOTE_SNAPSHOT.getSettingsKey())
+                        .build()
+                )
+                .build();
+            Metadata.Builder metadataBuilder = Metadata.builder(state.metadata())
+                .put(searchableSnapshotIndexMetadata, false)
+                .generateClusterUuidIfNeeded();
+            state = ClusterState.builder(state).metadata(metadataBuilder.build()).build();
+
+            // Verify default preference is primary only
+            GroupShardsIterator<ShardIterator> groupIterator = opRouting.searchShards(state, indexNames, null, null);
+            assertThat("One group per index shard", groupIterator.size(), equalTo(numIndices * numShards));
+
+            for (ShardIterator shardIterator : groupIterator) {
+                assertThat("Only single shard will be returned with no preference", shardIterator.size(), equalTo(1));
+                assertTrue("Only primary should exist with no preference", shardIterator.nextOrNull().primary());
+            }
+
+            // Verify alternative preference can be applied to a searchable snapshot index
+            groupIterator = opRouting.searchShards(state, indexNames, null, "_replica");
+            assertThat("One group per index shard", groupIterator.size(), equalTo(numIndices * numShards));
+
+            for (ShardIterator shardIterator : groupIterator) {
+                assertThat("Replica shards will be returned", shardIterator.size(), equalTo(numReplicas));
+                assertFalse("Returned shard should be a replica", shardIterator.nextOrNull().primary());
+            }
+        } finally {
+            IOUtils.close(clusterService);
+            terminate(threadPool);
+        }
+    }
+
     private DiscoveryNode[] setupNodes() {
         // Sets up two data nodes in zone-a and one data node in zone-b
         List<String> zones = Arrays.asList("a", "a", "b");
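
The test pins down two behaviors: remote_snapshot indices now route searches to primaries by default, and an explicit preference still overrides that default. From a client's point of view, the override looks roughly like this (illustrative, not taken from the patch; the index name is made up):

    SearchRequest request = new SearchRequest("my-remote-snapshot-index");
    request.preference("_replica"); // opt back into replica routing despite the new primary default
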
diff --git a/server/src/test/java/org/opensearch/cluster/structure/RoutingIteratorTests.java b/server/src/test/java/org/opensearch/cluster/structure/RoutingIteratorTests.java
index cd645b08a119c..e24e8b2ad3f87 100644
--- a/server/src/test/java/org/opensearch/cluster/structure/RoutingIteratorTests.java
+++ b/server/src/test/java/org/opensearch/cluster/structure/RoutingIteratorTests.java
@@ -38,6 +38,7 @@
 import org.opensearch.cluster.OpenSearchAllocationTestCase;
 import org.opensearch.cluster.metadata.IndexMetadata;
 import org.opensearch.cluster.metadata.Metadata;
+import org.opensearch.cluster.node.DiscoveryNodeRole;
 import org.opensearch.cluster.node.DiscoveryNodes;
 import org.opensearch.cluster.routing.GroupShardsIterator;
 import org.opensearch.cluster.routing.IndexShardRoutingTable;
@@ -55,6 +56,7 @@
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.settings.ClusterSettings;
 import org.opensearch.common.settings.Settings;
+import org.opensearch.index.IndexModule;
 import org.opensearch.index.shard.ShardId;
 import org.opensearch.test.ClusterServiceUtils;
 import org.opensearch.threadpool.TestThreadPool;
@@ -562,6 +564,71 @@ public void testReplicaShardPreferenceIters() {
         assertTrue(routing.primary());
     }

+    public void testSearchableSnapshotPreference() {
+        AllocationService strategy = createAllocationService(
+            Settings.builder().put("cluster.routing.allocation.node_concurrent_recoveries", 10).build()
+        );
+        OperationRouting operationRouting = new OperationRouting(
+            Settings.EMPTY,
+            new ClusterSettings(Settings.EMPTY, ClusterSettings.BUILT_IN_CLUSTER_SETTINGS)
+        );
+
+        // Modify the index to be a searchable snapshot index
+        Metadata metadata = Metadata.builder()
+            .put(
+                IndexMetadata.builder("test")
+                    .settings(
+                        Settings.builder()
+                            .put(settings(Version.CURRENT).build())
+                            .put(IndexModule.INDEX_STORE_TYPE_SETTING.getKey(), IndexModule.Type.REMOTE_SNAPSHOT.getSettingsKey())
+                    )
+                    .numberOfShards(2)
+                    .numberOfReplicas(2)
+            )
+            .build();
+
+        RoutingTable routingTable = RoutingTable.builder().addAsNew(metadata.index("test")).build();
+
+        ClusterState clusterState = ClusterState.builder(ClusterName.CLUSTER_NAME_SETTING.getDefault(Settings.EMPTY))
+            .metadata(metadata)
+            .routingTable(routingTable)
+            .build();
+
+        clusterState = ClusterState.builder(clusterState)
+            .nodes(
+                DiscoveryNodes.builder()
+                    .add(newNode("node1", Collections.singleton(DiscoveryNodeRole.CLUSTER_MANAGER_ROLE)))
+                    .add(newNode("node2", Collections.singleton(DiscoveryNodeRole.SEARCH_ROLE)))
+                    .add(newNode("node3", Collections.singleton(DiscoveryNodeRole.SEARCH_ROLE)))
+                    .localNodeId("node1")
+            )
+            .build();
+
+        clusterState = strategy.reroute(clusterState, "reroute"); // Move primaries to initializing
+        clusterState = strategy.applyStartedShards(clusterState, clusterState.getRoutingNodes().shardsWithState(INITIALIZING));
+
+        clusterState = strategy.reroute(clusterState, "reroute"); // Move replicas to initializing
+        clusterState = strategy.applyStartedShards(clusterState, clusterState.getRoutingNodes().shardsWithState(INITIALIZING));
+
+        // When replicas haven't initialized, it comes back with the primary first, then initializing replicas
+        GroupShardsIterator<ShardIterator> shardIterators = operationRouting.searchShards(
+            clusterState,
+            new String[] { "test" },
+            null,
+            null
+        );
+        assertThat(shardIterators.size(), equalTo(2)); // two potential shards
+        ShardIterator iter = shardIterators.iterator().next();
+        assertThat(iter.size(), equalTo(1)); // one potential candidate (primary) for the shard
+
+        ShardRouting routing = iter.nextOrNull();
+        assertNotNull(routing);
+        assertTrue(routing.primary()); // Default preference is primary
+        assertTrue(routing.started());
+        routing = iter.nextOrNull();
+        assertNull(routing); // No other candidate apart from primary
+    }
+
     public void testWeightedRoutingWithDifferentWeights() {
         TestThreadPool threadPool = null;
         try {
diff --git a/server/src/test/java/org/opensearch/common/lucene/LuceneTests.java b/server/src/test/java/org/opensearch/common/lucene/LuceneTests.java
index f61338af2ba8a..70d3e8d5671e2 100644
--- a/server/src/test/java/org/opensearch/common/lucene/LuceneTests.java
+++ b/server/src/test/java/org/opensearch/common/lucene/LuceneTests.java
@@ -339,7 +339,7 @@ public void testReadSegmentInfosExtendedCompatibility() throws IOException {
         try (MockDirectoryWrapper dir = newMockFSDirectory(tmp)) {
             // The standard API will throw an exception
             expectThrows(IndexFormatTooOldException.class, () -> Lucene.readSegmentInfos(dir));
-            SegmentInfos si = Lucene.readSegmentInfosExtendedCompatibility(dir, minVersion);
+            SegmentInfos si = Lucene.readSegmentInfos(dir, minVersion);
             assertEquals(1, Lucene.getNumDocs(si));
             IndexCommit indexCommit = Lucene.getIndexCommit(si, dir);
             // uses the "expert" Lucene API
@@ -358,59 +358,6 @@ public void testReadSegmentInfosExtendedCompatibility() throws IOException {
         }
     }

-    /**
-     * Since the implementation in {@link Lucene#readSegmentInfosExtendedCompatibility(Directory, Version)}
-     * is a workaround, this test verifies that the response from this method is equivalent to
-     * {@link Lucene#readSegmentInfos(Directory)} if the version is N-1
-     */
-    public void testReadSegmentInfosExtendedCompatibilityBaseCase() throws IOException {
-        MockDirectoryWrapper dir = newMockDirectory();
-        IndexWriterConfig iwc = newIndexWriterConfig();
-        IndexWriter writer = new IndexWriter(dir, iwc);
-        Document doc = new Document();
-        doc.add(new TextField("id", "1", random().nextBoolean() ? Field.Store.YES : Field.Store.NO));
-        writer.addDocument(doc);
-        writer.commit();
-        SegmentInfos expectedSI = Lucene.readSegmentInfos(dir);
-        SegmentInfos actualSI = Lucene.readSegmentInfosExtendedCompatibility(dir, Version.CURRENT);
-        assertEquals(Lucene.getNumDocs(expectedSI), Lucene.getNumDocs(actualSI));
-        assertEquals(expectedSI.getGeneration(), actualSI.getGeneration());
-        assertEquals(expectedSI.getSegmentsFileName(), actualSI.getSegmentsFileName());
-        assertEquals(expectedSI.getVersion(), actualSI.getVersion());
-        assertEquals(expectedSI.getCommitLuceneVersion(), actualSI.getCommitLuceneVersion());
-        assertEquals(expectedSI.getMinSegmentLuceneVersion(), actualSI.getMinSegmentLuceneVersion());
-        assertEquals(expectedSI.getIndexCreatedVersionMajor(), actualSI.getIndexCreatedVersionMajor());
-        assertEquals(expectedSI.getUserData(), actualSI.getUserData());
-
-        int numDocsToIndex = randomIntBetween(10, 50);
-        List<Term> deleteTerms = new ArrayList<>();
-        for (int i = 0; i < numDocsToIndex; i++) {
-            doc = new Document();
-            doc.add(new TextField("id", "doc_" + i, random().nextBoolean() ? Field.Store.YES : Field.Store.NO));
-            deleteTerms.add(new Term("id", "doc_" + i));
-            writer.addDocument(doc);
-        }
-        int numDocsToDelete = randomIntBetween(0, numDocsToIndex);
-        Collections.shuffle(deleteTerms, random());
-        for (int i = 0; i < numDocsToDelete; i++) {
-            Term remove = deleteTerms.remove(0);
-            writer.deleteDocuments(remove);
-        }
-        writer.commit();
-        expectedSI = Lucene.readSegmentInfos(dir);
-        actualSI = Lucene.readSegmentInfosExtendedCompatibility(dir, Version.CURRENT);
-        assertEquals(Lucene.getNumDocs(expectedSI), Lucene.getNumDocs(actualSI));
-        assertEquals(expectedSI.getGeneration(), actualSI.getGeneration());
-        assertEquals(expectedSI.getSegmentsFileName(), actualSI.getSegmentsFileName());
-        assertEquals(expectedSI.getVersion(), actualSI.getVersion());
-        assertEquals(expectedSI.getCommitLuceneVersion(), actualSI.getCommitLuceneVersion());
-        assertEquals(expectedSI.getMinSegmentLuceneVersion(), actualSI.getMinSegmentLuceneVersion());
-        assertEquals(expectedSI.getIndexCreatedVersionMajor(), actualSI.getIndexCreatedVersionMajor());
-        assertEquals(expectedSI.getUserData(), actualSI.getUserData());
-        writer.close();
-        dir.close();
-    }
-
     public void testCount() throws Exception {
         Directory dir = newDirectory();
         RandomIndexWriter w = new RandomIndexWriter(random(), dir);
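
With the extended-compatibility reader folded into an overload of Lucene.readSegmentInfos, only the call shape changes for callers. A sketch of the two entry points as the updated test exercises them (dir and minVersion as in the test above):

    SegmentInfos latest = Lucene.readSegmentInfos(dir);             // strict: latest supported format only
    SegmentInfos compat = Lucene.readSegmentInfos(dir, minVersion); // tolerant down to the given minimum version
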
diff --git a/server/src/test/java/org/opensearch/extensions/ExtensionsManagerTests.java b/server/src/test/java/org/opensearch/extensions/ExtensionsManagerTests.java
index 392e7e02ebbcc..42a050270466d 100644
--- a/server/src/test/java/org/opensearch/extensions/ExtensionsManagerTests.java
+++ b/server/src/test/java/org/opensearch/extensions/ExtensionsManagerTests.java
@@ -48,7 +48,6 @@
 import org.opensearch.common.util.FeatureFlags;
 import org.opensearch.env.EnvironmentSettingsResponse;
 import org.opensearch.cluster.metadata.IndexMetadata;
-import org.opensearch.cluster.metadata.IndexNameExpressionResolver;
 import org.opensearch.cluster.node.DiscoveryNode;
 import org.opensearch.cluster.service.ClusterService;
 import org.opensearch.common.bytes.BytesReference;
@@ -65,29 +64,20 @@
 import org.opensearch.common.settings.SettingsModule;
 import org.opensearch.common.transport.TransportAddress;
 import org.opensearch.common.util.PageCacheRecycler;
-import org.opensearch.common.util.concurrent.ThreadContext;
 import org.opensearch.env.Environment;
-import org.opensearch.env.TestEnvironment;
 import org.opensearch.extensions.proto.ExtensionRequestProto;
 import org.opensearch.extensions.rest.RegisterRestActionsRequest;
 import org.opensearch.extensions.settings.RegisterCustomSettingsRequest;
 import org.opensearch.identity.IdentityService;
-import org.opensearch.index.IndexModule;
-import org.opensearch.index.IndexSettings;
-import org.opensearch.index.analysis.AnalysisRegistry;
-import org.opensearch.index.engine.EngineConfigFactory;
-import org.opensearch.index.engine.InternalEngineFactory;
 import org.opensearch.indices.breaker.NoneCircuitBreakerService;
 import org.opensearch.rest.RestController;
 import org.opensearch.test.FeatureFlagSetter;
-import org.opensearch.test.IndexSettingsModule;
 import org.opensearch.test.MockLogAppender;
 import org.opensearch.test.OpenSearchTestCase;
 import org.opensearch.test.client.NoOpNodeClient;
 import org.opensearch.test.transport.MockTransportService;
 import org.opensearch.threadpool.TestThreadPool;
 import org.opensearch.threadpool.ThreadPool;
-import org.opensearch.transport.NodeNotConnectedException;
 import org.opensearch.transport.Transport;
 import org.opensearch.transport.TransportResponse;
 import org.opensearch.transport.TransportService;
@@ -838,40 +828,6 @@ public void testRegisterHandler() throws Exception {

     }

-    public void testOnIndexModule() throws Exception {
-        Files.write(extensionDir.resolve("extensions.yml"), extensionsYmlLines, StandardCharsets.UTF_8);
-        ExtensionsManager extensionsManager = new ExtensionsManager(extensionDir);
-        initialize(extensionsManager);
-
-        Environment environment = TestEnvironment.newEnvironment(settings);
-        AnalysisRegistry emptyAnalysisRegistry = new AnalysisRegistry(
-            environment,
-            emptyMap(),
-            emptyMap(),
-            emptyMap(),
-            emptyMap(),
-            emptyMap(),
-            emptyMap(),
-            emptyMap(),
-            emptyMap(),
-            emptyMap()
-        );
-
-        IndexSettings indexSettings = IndexSettingsModule.newIndexSettings("test_index", settings);
-        IndexModule indexModule = new IndexModule(
-            indexSettings,
-            emptyAnalysisRegistry,
-            new InternalEngineFactory(),
-            new EngineConfigFactory(indexSettings),
-            Collections.emptyMap(),
-            () -> true,
-            new IndexNameExpressionResolver(new ThreadContext(Settings.EMPTY)),
-            Collections.emptyMap()
-        );
-        expectThrows(NodeNotConnectedException.class, () -> extensionsManager.onIndexModule(indexModule));
-
-    }
-
     public void testIncompatibleExtensionRegistration() throws IOException, IllegalAccessException {
         try (MockLogAppender mockLogAppender = MockLogAppender.createForLoggers(LogManager.getLogger(ExtensionsManager.class))) {
diff --git a/server/src/test/java/org/opensearch/nodesinfo/NodeInfoStreamingTests.java b/server/src/test/java/org/opensearch/nodesinfo/NodeInfoStreamingTests.java
index bec921bc5bf5d..cdd1c682b40dc 100644
--- a/server/src/test/java/org/opensearch/nodesinfo/NodeInfoStreamingTests.java
+++ b/server/src/test/java/org/opensearch/nodesinfo/NodeInfoStreamingTests.java
@@ -251,7 +251,7 @@ private static NodeInfo createNodeInfo() {
             for (int i = 0; i < numProcessors; i++) {
                 processors.add(new org.opensearch.search.pipeline.ProcessorInfo(randomAlphaOfLengthBetween(3, 10)));
             }
-            searchPipelineInfo = new SearchPipelineInfo(processors);
+            searchPipelineInfo = new SearchPipelineInfo(Map.of(randomAlphaOfLengthBetween(3, 10), processors));
         }

         return new NodeInfo(
diff --git a/server/src/test/java/org/opensearch/search/SearchServiceTests.java b/server/src/test/java/org/opensearch/search/SearchServiceTests.java
index a09e05915b779..a17834bccb238 100644
--- a/server/src/test/java/org/opensearch/search/SearchServiceTests.java
+++ b/server/src/test/java/org/opensearch/search/SearchServiceTests.java
@@ -35,6 +35,7 @@
 import org.apache.lucene.index.DirectoryReader;
 import org.apache.lucene.index.FilterDirectoryReader;
 import org.apache.lucene.index.LeafReader;
+import org.apache.lucene.search.FieldDoc;
 import org.apache.lucene.search.Query;
 import org.apache.lucene.store.AlreadyClosedException;
 import org.opensearch.action.ActionListener;
@@ -104,6 +105,9 @@
 import org.opensearch.search.internal.ShardSearchContextId;
 import org.opensearch.search.internal.ShardSearchRequest;
 import org.opensearch.search.query.QuerySearchResult;
+import org.opensearch.search.sort.FieldSortBuilder;
+import org.opensearch.search.sort.MinAndMax;
+import org.opensearch.search.sort.SortOrder;
 import org.opensearch.search.suggest.SuggestBuilder;
 import org.opensearch.test.OpenSearchSingleNodeTestCase;
 import org.opensearch.threadpool.ThreadPool;
@@ -1562,4 +1566,82 @@ public void validatePitStats(String index, long expectedPitCurrent, long expecte
         assertEquals(expectedPitCurrent, indexShard.searchStats().getTotal().getPitCurrent());
         assertEquals(expectedPitCount, indexShard.searchStats().getTotal().getPitCount());
     }
+
+    /**
+     * Test for ASC order search_after query.
+     * Min = 0L, Max = 9L, search_after = 10L
+     * Expected result is canMatch = false
+     */
+    public void testCanMatchSearchAfterAscGreaterThanMax() throws IOException {
+        FieldDoc searchAfter = new FieldDoc(0, 0, new Long[] { 10L });
+        MinAndMax<Long> minMax = new MinAndMax<>(0L, 9L);
+        FieldSortBuilder primarySort = new FieldSortBuilder("test");
+        primarySort.order(SortOrder.ASC);
+        assertEquals(SearchService.canMatchSearchAfter(searchAfter, minMax, primarySort), false);
+    }
+
+    /**
+     * Test for ASC order search_after query.
+     * Min = 0L, Max = 9L, search_after = 7L
+     * Expected result is canMatch = true
+     */
+    public void testCanMatchSearchAfterAscLessThanMax() throws IOException {
+        FieldDoc searchAfter = new FieldDoc(0, 0, new Long[] { 7L });
+        MinAndMax<Long> minMax = new MinAndMax<>(0L, 9L);
+        FieldSortBuilder primarySort = new FieldSortBuilder("test");
+        primarySort.order(SortOrder.ASC);
+        assertEquals(SearchService.canMatchSearchAfter(searchAfter, minMax, primarySort), true);
+    }
+
+    /**
+     * Test for ASC order search_after query.
+     * Min = 0L, Max = 9L, search_after = 9L
+     * Expected result is canMatch = true
+     */
+    public void testCanMatchSearchAfterAscEqualMax() throws IOException {
+        FieldDoc searchAfter = new FieldDoc(0, 0, new Long[] { 9L });
+        MinAndMax<Long> minMax = new MinAndMax<>(0L, 9L);
+        FieldSortBuilder primarySort = new FieldSortBuilder("test");
+        primarySort.order(SortOrder.ASC);
+        assertEquals(SearchService.canMatchSearchAfter(searchAfter, minMax, primarySort), true);
+    }
+
+    /**
+     * Test for DESC order search_after query.
+     * Min = 0L, Max = 9L, search_after = 10L
+     * Expected result is canMatch = true
+     */
+    public void testCanMatchSearchAfterDescGreaterThanMin() throws IOException {
+        FieldDoc searchAfter = new FieldDoc(0, 0, new Long[] { 10L });
+        MinAndMax<Long> minMax = new MinAndMax<>(0L, 9L);
+        FieldSortBuilder primarySort = new FieldSortBuilder("test");
+        primarySort.order(SortOrder.DESC);
+        assertEquals(SearchService.canMatchSearchAfter(searchAfter, minMax, primarySort), true);
+    }
+
+    /**
+     * Test for DESC order search_after query.
+     * Min = 0L, Max = 9L, search_after = -1L
+     * Expected result is canMatch = false
+     */
+    public void testCanMatchSearchAfterDescLessThanMin() throws IOException {
+        FieldDoc searchAfter = new FieldDoc(0, 0, new Long[] { -1L });
+        MinAndMax<Long> minMax = new MinAndMax<>(0L, 9L);
+        FieldSortBuilder primarySort = new FieldSortBuilder("test");
+        primarySort.order(SortOrder.DESC);
+        assertEquals(SearchService.canMatchSearchAfter(searchAfter, minMax, primarySort), false);
+    }
+
+    /**
+     * Test for DESC order search_after query.
+     * Min = 0L, Max = 9L, search_after = 0L
+     * Expected result is canMatch = true
+     */
+    public void testCanMatchSearchAfterDescEqualMin() throws IOException {
+        FieldDoc searchAfter = new FieldDoc(0, 0, new Long[] { 0L });
+        MinAndMax<Long> minMax = new MinAndMax<>(0L, 9L);
+        FieldSortBuilder primarySort = new FieldSortBuilder("test");
+        primarySort.order(SortOrder.DESC);
+        assertEquals(SearchService.canMatchSearchAfter(searchAfter, minMax, primarySort), true);
+    }
 }
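
Taken together, the six tests encode a simple skipping rule for a single-field primary sort: an ascending query can skip a shard or segment whose max falls strictly before the cursor, a descending query one whose min falls strictly after it, and equality never skips, since ties can still match. Roughly (a sketch of the intent, not the exact SearchService implementation, which also has to handle missing bounds):

    boolean canMatch = SortOrder.ASC.equals(primarySort.order())
        ? minMax.compareMax(searchAfterValue) >= 0   // some doc may sort at or after the cursor
        : minMax.compareMin(searchAfterValue) <= 0;  // some doc may sort at or before the cursor
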
diff --git a/server/src/test/java/org/opensearch/search/pipeline/SearchPipelineInfoTests.java b/server/src/test/java/org/opensearch/search/pipeline/SearchPipelineInfoTests.java
new file mode 100644
index 0000000000000..6eb137cb28e8f
--- /dev/null
+++ b/server/src/test/java/org/opensearch/search/pipeline/SearchPipelineInfoTests.java
@@ -0,0 +1,75 @@
+/*
+ * SPDX-License-Identifier: Apache-2.0
+ *
+ * The OpenSearch Contributors require contributions made to
+ * this file be licensed under the Apache-2.0 license or a
+ * compatible open source license.
+ */
+
+package org.opensearch.search.pipeline;
+
+import org.opensearch.Version;
+import org.opensearch.common.io.stream.BytesStreamOutput;
+import org.opensearch.common.io.stream.StreamInput;
+import org.opensearch.test.OpenSearchTestCase;
+
+import java.io.IOException;
+import java.util.List;
+import java.util.Map;
+
+public class SearchPipelineInfoTests extends OpenSearchTestCase {
+    public void testSerializationRoundtrip() throws IOException {
+        SearchPipelineInfo searchPipelineInfo = new SearchPipelineInfo(
+            Map.of(
+                "a",
+                List.of(new ProcessorInfo("a1"), new ProcessorInfo("a2"), new ProcessorInfo("a3")),
+                "b",
+                List.of(new ProcessorInfo("b1"), new ProcessorInfo("b2")),
+                "c",
+                List.of(new ProcessorInfo("c1"))
+            )
+        );
+        SearchPipelineInfo deserialized;
+        try (BytesStreamOutput bytesStreamOutput = new BytesStreamOutput()) {
+            searchPipelineInfo.writeTo(bytesStreamOutput);
+            try (StreamInput bytesStreamInput = bytesStreamOutput.bytes().streamInput()) {
+                deserialized = new SearchPipelineInfo(bytesStreamInput);
+            }
+        }
+        assertTrue(deserialized.containsProcessor("a", "a1"));
+        assertTrue(deserialized.containsProcessor("a", "a2"));
+        assertTrue(deserialized.containsProcessor("a", "a3"));
+        assertTrue(deserialized.containsProcessor("b", "b1"));
+        assertTrue(deserialized.containsProcessor("b", "b2"));
+        assertTrue(deserialized.containsProcessor("c", "c1"));
+    }
+
+    /**
+     * When serializing / deserializing to / from old versions, processor type info is lost.
+     *
+     * Also, old versions only supported request/response processors.
+     */
+    public void testSerializationRoundtripBackcompat() throws IOException {
+        SearchPipelineInfo searchPipelineInfo = new SearchPipelineInfo(
+            Map.of(
+                Pipeline.REQUEST_PROCESSORS_KEY,
+                List.of(new ProcessorInfo("a1"), new ProcessorInfo("a2"), new ProcessorInfo("a3")),
+                Pipeline.RESPONSE_PROCESSORS_KEY,
+                List.of(new ProcessorInfo("b1"), new ProcessorInfo("b2"))
+            )
+        );
+        SearchPipelineInfo deserialized;
+        try (BytesStreamOutput bytesStreamOutput = new BytesStreamOutput()) {
+            bytesStreamOutput.setVersion(Version.V_2_7_0);
+            searchPipelineInfo.writeTo(bytesStreamOutput);
+            try (StreamInput bytesStreamInput = bytesStreamOutput.bytes().streamInput()) {
+                bytesStreamInput.setVersion(Version.V_2_7_0);
+                deserialized = new SearchPipelineInfo(bytesStreamInput);
+            }
+        }
+        for (String proc : List.of("a1", "a2", "a3", "b1", "b2")) {
+            assertTrue(deserialized.containsProcessor(Pipeline.REQUEST_PROCESSORS_KEY, proc));
+            assertTrue(deserialized.containsProcessor(Pipeline.RESPONSE_PROCESSORS_KEY, proc));
+        }
+    }
+}
diff --git a/server/src/test/java/org/opensearch/search/pipeline/SearchPipelineServiceTests.java b/server/src/test/java/org/opensearch/search/pipeline/SearchPipelineServiceTests.java
index 36978d5310810..516227e9a13d8 100644
--- a/server/src/test/java/org/opensearch/search/pipeline/SearchPipelineServiceTests.java
+++ b/server/src/test/java/org/opensearch/search/pipeline/SearchPipelineServiceTests.java
@@ -60,9 +60,13 @@ public class SearchPipelineServiceTests extends OpenSearchTestCase {
     private static final SearchPipelinePlugin DUMMY_PLUGIN = new SearchPipelinePlugin() {
         @Override
-        public Map<String, Processor.Factory> getProcessors(Processor.Parameters parameters) {
+        public Map<String, Processor.Factory<SearchRequestProcessor>> getRequestProcessors(Processor.Parameters parameters) {
             return Map.of("foo", (factories, tag, description, config) -> null);
         }
+
+        public Map<String, Processor.Factory<SearchResponseProcessor>> getResponseProcessors(Processor.Parameters parameters) {
+            return Map.of("bar", (factories, tag, description, config) -> null);
+        }
     };

     private ThreadPool threadPool;
@@ -89,9 +93,14 @@ public void testSearchPipelinePlugin() {
             client,
             false
         );
-        Map<String, Processor.Factory> factories = searchPipelineService.getProcessorFactories();
-        assertEquals(1, factories.size());
-        assertTrue(factories.containsKey("foo"));
+        Map<String, Processor.Factory<SearchRequestProcessor>> requestProcessorFactories = searchPipelineService
+            .getRequestProcessorFactories();
+        assertEquals(1, requestProcessorFactories.size());
+        assertTrue(requestProcessorFactories.containsKey("foo"));
+        Map<String, Processor.Factory<SearchResponseProcessor>> responseProcessorFactories = searchPipelineService
+            .getResponseProcessorFactories();
+        assertEquals(1, responseProcessorFactories.size());
+        assertTrue(responseProcessorFactories.containsKey("bar"));
     }

     public void testSearchPipelinePluginDuplicate() {
@@ -235,8 +244,8 @@ public SearchResponse processResponse(SearchRequest request, SearchResponse resp
     }

     private SearchPipelineService createWithProcessors() {
-        Map<String, Processor.Factory> processors = new HashMap<>();
-        processors.put("scale_request_size", (processorFactories, tag, description, config) -> {
+        Map<String, Processor.Factory<SearchRequestProcessor>> requestProcessors = new HashMap<>();
+        requestProcessors.put("scale_request_size", (processorFactories, tag, description, config) -> {
             float scale = ((Number) config.remove("scale")).floatValue();
             return new FakeRequestProcessor(
                 "scale_request_size",
                 tag,
                 description,
                 req -> req.source().size((int) (req.source().size() * scale))
             );
         });
-        processors.put("fixed_score", (processorFactories, tag, description, config) -> {
+        Map<String, Processor.Factory<SearchResponseProcessor>> responseProcessors = new HashMap<>();
+        responseProcessors.put("fixed_score", (processorFactories, tag, description, config) -> {
             float score = ((Number) config.remove("score")).floatValue();
             return new FakeResponseProcessor("fixed_score", tag, description, rsp -> rsp.getHits().forEach(h -> h.score(score)));
         });
-        return createWithProcessors(processors);
+        return createWithProcessors(requestProcessors, responseProcessors);
     }

     @Override
@@ -258,7 +268,10 @@ protected NamedWriteableRegistry writableRegistry() {
         return new NamedWriteableRegistry(searchModule.getNamedWriteables());
     }

-    private SearchPipelineService createWithProcessors(Map<String, Processor.Factory> processors) {
+    private SearchPipelineService createWithProcessors(
+        Map<String, Processor.Factory<SearchRequestProcessor>> requestProcessors,
+        Map<String, Processor.Factory<SearchResponseProcessor>> responseProcessors
+    ) {
         Client client = mock(Client.class);
         ThreadPool threadPool = mock(ThreadPool.class);
         ExecutorService executorService = OpenSearchExecutors.newDirectExecutorService();
@@ -274,8 +287,13 @@ private SearchPipelineService createWithProcessors(Map
                 @Override
-                public Map<String, Processor.Factory> getProcessors(Processor.Parameters parameters) {
-                    return processors;
+                public Map<String, Processor.Factory<SearchRequestProcessor>> getRequestProcessors(Processor.Parameters parameters) {
+                    return requestProcessors;
+                }
+
+                @Override
+                public Map<String, Processor.Factory<SearchResponseProcessor>> getResponseProcessors(Processor.Parameters parameters) {
+                    return responseProcessors;
                 }
             }),
             client,
@@ -619,13 +637,14 @@ public void testValidatePipeline() throws Exception {
             XContentType.JSON
         );

+        SearchPipelineInfo completePipelineInfo = new SearchPipelineInfo(
+            Map.of(Pipeline.REQUEST_PROCESSORS_KEY, List.of(reqProcessor), Pipeline.RESPONSE_PROCESSORS_KEY, List.of(rspProcessor))
+        );
+        SearchPipelineInfo incompletePipelineInfo = new SearchPipelineInfo(Map.of(Pipeline.REQUEST_PROCESSORS_KEY, List.of(reqProcessor)));
         // One node is missing a processor
         expectThrows(
             OpenSearchParseException.class,
-            () -> searchPipelineService.validatePipeline(
-                Map.of(n1, new SearchPipelineInfo(List.of(reqProcessor, rspProcessor)), n2, new SearchPipelineInfo(List.of(reqProcessor))),
-                putRequest
-            )
+            () -> searchPipelineService.validatePipeline(Map.of(n1, completePipelineInfo, n2, incompletePipelineInfo), putRequest)
         );

         // Discovery failed, no infos passed.
@@ -644,27 +663,11 @@ public void testValidatePipeline() throws Exception {
         );
         expectThrows(
             ClassCastException.class,
-            () -> searchPipelineService.validatePipeline(
-                Map.of(
-                    n1,
-                    new SearchPipelineInfo(List.of(reqProcessor, rspProcessor)),
-                    n2,
-                    new SearchPipelineInfo(List.of(reqProcessor, rspProcessor))
-                ),
-                badPutRequest
-            )
+            () -> searchPipelineService.validatePipeline(Map.of(n1, completePipelineInfo, n2, completePipelineInfo), badPutRequest)
         );

         // Success
-        searchPipelineService.validatePipeline(
-            Map.of(
-                n1,
-                new SearchPipelineInfo(List.of(reqProcessor, rspProcessor)),
-                n2,
-                new SearchPipelineInfo(List.of(reqProcessor, rspProcessor))
-            ),
-            putRequest
-        );
+        searchPipelineService.validatePipeline(Map.of(n1, completePipelineInfo, n2, completePipelineInfo), putRequest);
     }

     /**
@@ -717,7 +720,7 @@ public void testInlinePipeline() throws Exception {
     public void testInfo() {
         SearchPipelineService searchPipelineService = createWithProcessors();
         SearchPipelineInfo info = searchPipelineService.info();
-        assertTrue(info.containsProcessor("scale_request_size"));
-        assertTrue(info.containsProcessor("fixed_score"));
+        assertTrue(info.containsProcessor(Pipeline.REQUEST_PROCESSORS_KEY, "scale_request_size"));
+        assertTrue(info.containsProcessor(Pipeline.RESPONSE_PROCESSORS_KEY, "fixed_score"));
     }
 }
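
With factories split by processor type, a plugin now declares its request and response processors separately, exactly as the dummy plugin above does. A minimal standalone plugin under the split API (a sketch; the class name and processor names are illustrative, and the factory bodies are elided):

    public class MyPipelinePlugin extends Plugin implements SearchPipelinePlugin {
        @Override
        public Map<String, Processor.Factory<SearchRequestProcessor>> getRequestProcessors(Processor.Parameters parameters) {
            return Map.of("my_request_processor", (factories, tag, description, config) -> {
                return null; // build and return a SearchRequestProcessor here
            });
        }

        @Override
        public Map<String, Processor.Factory<SearchResponseProcessor>> getResponseProcessors(Processor.Parameters parameters) {
            return Map.of("my_response_processor", (factories, tag, description, config) -> {
                return null; // build and return a SearchResponseProcessor here
            });
        }
    }
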
diff --git a/server/src/test/java/org/opensearch/search/sort/MinAndMaxTests.java b/server/src/test/java/org/opensearch/search/sort/MinAndMaxTests.java
new file mode 100644
index 0000000000000..7557a81db0ad3
--- /dev/null
+++ b/server/src/test/java/org/opensearch/search/sort/MinAndMaxTests.java
@@ -0,0 +1,61 @@
+/*
+ * SPDX-License-Identifier: Apache-2.0
+ *
+ * The OpenSearch Contributors require contributions made to
+ * this file be licensed under the Apache-2.0 license or a
+ * compatible open source license.
+ */
+
+/*
+ * Licensed to Elasticsearch under one or more contributor
+ * license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright
+ * ownership. Elasticsearch licenses this file to you under
+ * the Apache License, Version 2.0 (the "License"); you may
+ * not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+/*
+ * Modifications Copyright OpenSearch Contributors. See
+ * GitHub history for details.
+ */
+
+package org.opensearch.search.sort;
+
+import org.apache.lucene.util.BytesRef;
+import org.opensearch.test.OpenSearchTestCase;
+
+import java.math.BigInteger;
+
+public class MinAndMaxTests extends OpenSearchTestCase {
+
+    public void testCompareMin() {
+        assertEquals(true, new MinAndMax<>(0L, 9L).compareMin(15L) < 0); // LONG
+        assertEquals(true, new MinAndMax<>(0, 9).compareMin(15) < 0); // INT
+        assertEquals(true, new MinAndMax<>(0f, 9f).compareMin(15f) < 0); // FLOAT
+        assertEquals(true, new MinAndMax<>(0d, 9d).compareMin(15d) < 0); // DOUBLE
+        assertEquals(true, new MinAndMax<>(BigInteger.valueOf(0), BigInteger.valueOf(9)).compareMin(BigInteger.valueOf(15)) < 0); // BigInteger
+        assertEquals(true, new MinAndMax<>(new BytesRef("a"), new BytesRef("b")).compareMin(new BytesRef("c")) < 0); // BytesRef
+        expectThrows(UnsupportedOperationException.class, () -> new MinAndMax<>("a", "b").compareMin("c"));
+    }
+
+    public void testCompareMax() {
+        assertEquals(true, new MinAndMax<>(0L, 9L).compareMax(15L) < 0); // LONG
+        assertEquals(true, new MinAndMax<>(0, 9).compareMax(15) < 0); // INT
+        assertEquals(true, new MinAndMax<>(0f, 9f).compareMax(15f) < 0); // FLOAT
+        assertEquals(true, new MinAndMax<>(0d, 9d).compareMax(15d) < 0); // DOUBLE
+        assertEquals(true, new MinAndMax<>(BigInteger.valueOf(0), BigInteger.valueOf(9)).compareMax(BigInteger.valueOf(15)) < 0); // BigInteger
+        assertEquals(true, new MinAndMax<>(new BytesRef("a"), new BytesRef("b")).compareMax(new BytesRef("c")) < 0); // BytesRef
+        expectThrows(UnsupportedOperationException.class, () -> new MinAndMax<>("a", "b").compareMax("c"));
+    }
+}
diff --git a/server/src/test/java/org/opensearch/snapshots/SnapshotResiliencyTests.java b/server/src/test/java/org/opensearch/snapshots/SnapshotResiliencyTests.java
index a51dc1c770f26..efa1db17d0461 100644
--- a/server/src/test/java/org/opensearch/snapshots/SnapshotResiliencyTests.java
+++ b/server/src/test/java/org/opensearch/snapshots/SnapshotResiliencyTests.java
@@ -195,7 +195,6 @@
 import org.opensearch.ingest.IngestService;
 import org.opensearch.monitor.StatusInfo;
 import org.opensearch.node.ResponseCollectorService;
-import org.opensearch.extensions.ExtensionsManager;
 import org.opensearch.plugins.PluginsService;
 import org.opensearch.repositories.RepositoriesService;
 import org.opensearch.repositories.Repository;
@@ -1807,7 +1806,6 @@ public void onFailure(final Exception e) {
             indicesService = new IndicesService(
                 settings,
                 mock(PluginsService.class),
-                mock(ExtensionsManager.class),
                 nodeEnv,
                 namedXContentRegistry,
                 new AnalysisRegistry(
diff --git a/server/src/test/java/org/opensearch/tasks/ListTasksResponseTests.java b/server/src/test/java/org/opensearch/tasks/ListTasksResponseTests.java
index d0002eb04914a..343f5b79a25a8 100644
--- a/server/src/test/java/org/opensearch/tasks/ListTasksResponseTests.java
+++ b/server/src/test/java/org/opensearch/tasks/ListTasksResponseTests.java
@@ -78,7 +78,8 @@ public void testNonEmptyToString() {
             {
                 put("dummy-type1", new TaskResourceUsage(100, 100));
             }
-        })
+        }),
+        0L
     );
     ListTasksResponse tasksResponse = new ListTasksResponse(singletonList(info), emptyList(), emptyList());
     assertEquals(
@@ -105,7 +106,9 @@ public void testNonEmptyToString() {
                 + "      \"cpu_time_in_nanos\" : 100,\n"
                 + "      \"memory_in_bytes\" : 100\n"
                 + "    }\n"
-                + "  }\n"
+                + "  },\n"
+                + "  \"cancellation_time\" : \"0s\",\n"
+                + "  \"cancellation_time_millis\" : 0\n"
                 + "  }\n"
                 + "]\n"
                 + "}",
diff --git a/server/src/test/java/org/opensearch/tasks/TaskInfoTests.java b/server/src/test/java/org/opensearch/tasks/TaskInfoTests.java
index c263972c9cb97..359b6898fa6b9 100644
--- a/server/src/test/java/org/opensearch/tasks/TaskInfoTests.java
+++ b/server/src/test/java/org/opensearch/tasks/TaskInfoTests.java
@@ -83,7 +83,7 @@ protected Predicate<String> getRandomFieldsExcludeFilter() {

     @Override
     protected TaskInfo mutateInstance(TaskInfo info) {
-        switch (between(0, 10)) {
+        switch (between(0, 11)) {
             case 0:
                 TaskId taskId = new TaskId(info.getTaskId().getNodeId() + randomAlphaOfLength(5), info.getTaskId().getId());
                 return new TaskInfo(
@@ -98,7 +98,8 @@ protected TaskInfo mutateInstance(TaskInfo info) {
                     info.isCancelled(),
                     info.getParentTaskId(),
                     info.getHeaders(),
-                    info.getResourceStats()
+                    info.getResourceStats(),
+                    info.getCancellationStartTime()
                 );
             case 1:
                 return new TaskInfo(
@@ -113,7 +114,8 @@ protected TaskInfo mutateInstance(TaskInfo info) {
                     info.isCancelled(),
                     info.getParentTaskId(),
                     info.getHeaders(),
-                    info.getResourceStats()
+                    info.getResourceStats(),
+                    info.getCancellationStartTime()
                 );
             case 2:
                 return new TaskInfo(
@@ -128,7 +130,8 @@ protected TaskInfo mutateInstance(TaskInfo info) {
                     info.isCancelled(),
                     info.getParentTaskId(),
                     info.getHeaders(),
-                    info.getResourceStats()
+                    info.getResourceStats(),
+                    info.getCancellationStartTime()
                 );
             case 3:
                 return new TaskInfo(
@@ -143,7 +146,8 @@ protected TaskInfo mutateInstance(TaskInfo info) {
                     info.isCancelled(),
                     info.getParentTaskId(),
                     info.getHeaders(),
-                    info.getResourceStats()
+                    info.getResourceStats(),
+                    info.getCancellationStartTime()
                 );
             case 4:
                 Task.Status newStatus = randomValueOtherThan(info.getStatus(), TaskInfoTests::randomRawTaskStatus);
@@ -159,7 +163,8 @@ protected TaskInfo mutateInstance(TaskInfo info) {
                     info.isCancelled(),
                     info.getParentTaskId(),
                     info.getHeaders(),
-                    info.getResourceStats()
+                    info.getResourceStats(),
+                    info.getCancellationStartTime()
                 );
             case 5:
                 return new TaskInfo(
@@ -174,7 +179,8 @@ protected TaskInfo mutateInstance(TaskInfo info) {
                     info.isCancelled(),
                     info.getParentTaskId(),
                     info.getHeaders(),
-                    info.getResourceStats()
+                    info.getResourceStats(),
+                    info.getCancellationStartTime()
                 );
             case 6:
                 return new TaskInfo(
@@ -189,7 +195,8 @@ protected TaskInfo mutateInstance(TaskInfo info) {
                     info.isCancelled(),
                     info.getParentTaskId(),
                     info.getHeaders(),
-                    info.getResourceStats()
+                    info.getResourceStats(),
+                    info.getCancellationStartTime()
                 );
             case 7:
                 return new TaskInfo(
@@ -204,7 +211,8 @@ protected TaskInfo mutateInstance(TaskInfo info) {
                     false,
                     info.getParentTaskId(),
                     info.getHeaders(),
-                    info.getResourceStats()
+                    info.getResourceStats(),
+                    info.getCancellationStartTime()
                 );
             case 8:
                 TaskId parentId = new TaskId(info.getParentTaskId().getNodeId() + randomAlphaOfLength(5), info.getParentTaskId().getId());
                 return new TaskInfo(
@@ -220,7 +228,8 @@ protected TaskInfo mutateInstance(TaskInfo info) {
                     info.isCancelled(),
                     parentId,
                     info.getHeaders(),
-                    info.getResourceStats()
+                    info.getResourceStats(),
+                    info.getCancellationStartTime()
                 );
             case 9:
                 Map<String, String> headers = info.getHeaders();
                 return new TaskInfo(
@@ -242,7 +251,8 @@ protected TaskInfo mutateInstance(TaskInfo info) {
                     info.isCancelled(),
                     info.getParentTaskId(),
                     headers,
-                    info.getResourceStats()
+                    info.getResourceStats(),
+                    info.getCancellationStartTime()
                 );
             case 10:
                 Map<String, TaskResourceUsage> resourceUsageMap;
@@ -266,6 +276,22 @@ protected TaskInfo mutateInstance(TaskInfo info) {
                     info.getHeaders(),
                     new TaskResourceStats(resourceUsageMap)
                 );
+            case 11:
+                return new TaskInfo(
+                    info.getTaskId(),
+                    info.getType(),
+                    info.getAction(),
+                    info.getDescription(),
+                    info.getStatus(),
+                    info.getStartTime(),
+                    info.getRunningTimeNanos(),
+                    true,
+                    true,
+                    info.getParentTaskId(),
+                    info.getHeaders(),
+                    info.getResourceStats(),
+                    randomNonNegativeLong()
+                );
             default:
                 throw new IllegalStateException();
         }
@@ -285,6 +311,10 @@ static TaskInfo randomTaskInfo(boolean detailed) {
         long runningTimeNanos = randomLong();
         boolean cancellable = randomBoolean();
         boolean cancelled = cancellable == true ? randomBoolean() : false;
+        Long cancellationStartTime = null;
+        if (cancelled) {
+            cancellationStartTime = randomNonNegativeLong();
+        }
         TaskId parentTaskId = randomBoolean() ? TaskId.EMPTY_TASK_ID : randomTaskId();
         Map<String, String> headers = randomBoolean()
             ? Collections.emptyMap()
@@ -301,7 +331,8 @@ static TaskInfo randomTaskInfo(boolean detailed) {
             cancelled,
             parentTaskId,
             headers,
-            randomResourceStats(detailed)
+            randomResourceStats(detailed),
+            cancellationStartTime
         );
     }

@@ -312,7 +343,7 @@ private static TaskId randomTaskId() {
     private static RawTaskStatus randomRawTaskStatus() {
         try (XContentBuilder builder = XContentBuilder.builder(Requests.INDEX_CONTENT_TYPE.xContent())) {
             builder.startObject();
-            int fields = between(0, 10);
+            int fields = between(0, 11);
             for (int f = 0; f < fields; f++) {
                 builder.field(randomAlphaOfLength(5), randomAlphaOfLength(5));
             }
diff --git a/test/fixtures/hdfs-fixture/build.gradle b/test/fixtures/hdfs-fixture/build.gradle
index 0eec2a4e9daa9..70b84a405c9c6 100644
--- a/test/fixtures/hdfs-fixture/build.gradle
+++ b/test/fixtures/hdfs-fixture/build.gradle
@@ -55,7 +55,7 @@ dependencies {
   api "com.fasterxml.jackson.jaxrs:jackson-jaxrs-json-provider:${versions.jackson}"
   api "com.fasterxml.jackson.core:jackson-databind:${versions.jackson_databind}"
   api "com.fasterxml.woodstox:woodstox-core:${versions.woodstox}"
-  api 'net.minidev:json-smart:2.4.10'
+  api 'net.minidev:json-smart:2.4.11'
   api "org.mockito:mockito-core:${versions.mockito}"
   api "com.google.protobuf:protobuf-java:${versions.protobuf}"
   api "org.jetbrains.kotlin:kotlin-stdlib:${versions.kotlin}"
diff --git a/test/framework/src/main/java/org/opensearch/script/MockScriptEngine.java b/test/framework/src/main/java/org/opensearch/script/MockScriptEngine.java
index 9f8abdf5f2c8a..98912e53c9d6a 100644
--- a/test/framework/src/main/java/org/opensearch/script/MockScriptEngine.java
+++ b/test/framework/src/main/java/org/opensearch/script/MockScriptEngine.java
@@ -177,6 +177,14 @@ public void execute(Map<String, Object> ctx) {
                 }
             };
             return context.factoryClazz.cast(factory);
+        } else if (context.instanceClazz.equals(SearchScript.class)) {
+            SearchScript.Factory factory = parameters -> new SearchScript(parameters) {
+                @Override
+                public void execute(Map<String, Object> ctx) {
+                    script.apply(ctx);
+                }
+            };
+            return context.factoryClazz.cast(factory);
         } else if (context.instanceClazz.equals(AggregationScript.class)) {
             return context.factoryClazz.cast(new MockAggregationScript(script));
         } else if (context.instanceClazz.equals(IngestConditionalScript.class)) {
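
The new branch lets tests obtain a SearchScript backed by the same registered mock function as every other script context in this class. Hypothetical test wiring (the script name and behavior are made up; the constructor shape is the one the mock engine already exposes):

    // register a named mock function and build the engine around it
    Map<String, Function<Map<String, Object>, Object>> scripts = Map.of(
        "mark-seen", ctx -> ctx.put("seen", true)
    );
    ScriptEngine engine = new MockScriptEngine(MockScriptEngine.NAME, scripts, Map.of());
    // compiling a script against the SearchScript context now resolves
    // through the branch added above and delegates to "mark-seen"
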