Commit
Apply formatting.xml in all lines
Signed-off-by: Varun Jain <[email protected]>
vibrantvarun committed Dec 28, 2023
1 parent d7c6151 commit 820917b
Showing 131 changed files with 602 additions and 248 deletions.
2 changes: 1 addition & 1 deletion .github/draft-release-notes-config.yml
@@ -37,4 +37,4 @@ categories:
- 'Maintenance'
- title: 'Refactoring'
labels:
- 'Refactoring'
- 'Refactoring'
38 changes: 19 additions & 19 deletions .github/workflows/add-untriaged.yml
@@ -1,19 +1,19 @@
name: Apply 'untriaged' label during issue lifecycle

on:
issues:
types: [opened, reopened, transferred]

jobs:
apply-label:
runs-on: ubuntu-latest
steps:
- uses: actions/github-script@v6
with:
script: |
github.rest.issues.addLabels({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
labels: ['untriaged']
})
name: Apply 'untriaged' label during issue lifecycle

on:
issues:
types: [opened, reopened, transferred]

jobs:
apply-label:
runs-on: ubuntu-latest
steps:
- uses: actions/github-script@v6
with:
script: |
github.rest.issues.addLabels({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
labels: ['untriaged']
})
@@ -77,4 +77,4 @@ jobs:
name: Run NeuralSearch Rolling-Upgrade BWC Tests from BWCVersion-${{ matrix.bwc_version }} to OpenSearch Version-${{ matrix.opensearch_version }} on Ubuntu
run: |
echo "Running rolling-upgrade backwards compatibility tests ..."
./gradlew :qa:rolling-upgrade:testRollingUpgrade -Dtests.bwc.version=$BWC_VERSION_ROLLING_UPGRADE
./gradlew :qa:rolling-upgrade:testRollingUpgrade -Dtests.bwc.version=$BWC_VERSION_ROLLING_UPGRADE
6 changes: 3 additions & 3 deletions .github/workflows/delete_backport_branch.yml
@@ -1,9 +1,9 @@
name: Delete merged branch of the backport PRs
on:
on:
pull_request:
types:
- closed

jobs:
delete-branch:
runs-on: ubuntu-latest
@@ -12,4 +12,4 @@ jobs:
- name: Delete merged branch
uses: SvanBoxel/delete-merged-branch@main
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
2 changes: 1 addition & 1 deletion .github/workflows/links.yml
@@ -20,4 +20,4 @@ jobs:
env:
GITHUB_TOKEN: ${{secrets.GITHUB_TOKEN}}
- name: Fail if there were link errors
run: exit ${{ steps.lychee.outputs.exit_code }}
run: exit ${{ steps.lychee.outputs.exit_code }}
14 changes: 7 additions & 7 deletions DEVELOPER_GUIDE.md
@@ -33,7 +33,7 @@ git clone https://github.com/[your username]/neural-search.git

#### JDK 11

OpenSearch builds using Java 11 at a minimum. This means you must have a JDK 11 installed with the environment variable
OpenSearch builds using Java 11 at a minimum. This means you must have a JDK 11 installed with the environment variable
`JAVA_HOME` referencing the path to Java home for your JDK 11 installation, e.g. `JAVA_HOME=/usr/lib/jvm/jdk-11`.

One easy way to get Java 11 on *nix is to use [sdkman](https://sdkman.io/).
@@ -83,10 +83,10 @@ Please follow these formatting guidelines:

## Build

OpenSearch neural-search uses a [Gradle](https://docs.gradle.org/6.6.1/userguide/userguide.html) wrapper for its build.
OpenSearch neural-search uses a [Gradle](https://docs.gradle.org/6.6.1/userguide/userguide.html) wrapper for its build.
Run `gradlew` on Unix systems.

Build OpenSearch neural-search using `gradlew build`
Build OpenSearch neural-search using `gradlew build`

```
./gradlew build
@@ -221,8 +221,8 @@ See [CONTRIBUTING](CONTRIBUTING.md).

## Backports

The Github workflow in [`backport.yml`](.github/workflows/backport.yml) creates backport PRs automatically when the
original PR with an appropriate label `backport <backport-branch-name>` is merged to main with the backport workflow
run successfully on the PR. For example, if a PR on main needs to be backported to `2.x` branch, add a label
`backport 2.x` to the PR and make sure the backport workflow runs on the PR along with other checks. Once this PR is
The Github workflow in [`backport.yml`](.github/workflows/backport.yml) creates backport PRs automatically when the
original PR with an appropriate label `backport <backport-branch-name>` is merged to main with the backport workflow
run successfully on the PR. For example, if a PR on main needs to be backported to `2.x` branch, add a label
`backport 2.x` to the PR and make sure the backport workflow runs on the PR along with other checks. Once this PR is
merged to main, the workflow will create a backport PR to the `2.x` branch.
2 changes: 1 addition & 1 deletion README.md
@@ -7,7 +7,7 @@
![PRs welcome!](https://img.shields.io/badge/PRs-welcome!-success)

## OpenSearch Neural Search
**OpenSearch Neural Search** is an OpenSearch plugin that adds dense neural retrieval into the OpenSearch ecosystem.
**OpenSearch Neural Search** is an OpenSearch plugin that adds dense neural retrieval into the OpenSearch ecosystem.
The plugin provides the capability for indexing documents and doing neural search on the indexed documents.

## Project Resources
12 changes: 2 additions & 10 deletions build.gradle
@@ -18,6 +18,7 @@ apply plugin: 'opensearch.pluginzip'
apply plugin: 'jacoco'
apply plugin: "com.diffplug.spotless"
apply plugin: 'io.freefair.lombok'
apply from: 'gradle/formatting.gradle'

def pluginName = 'opensearch-neural-search'
def pluginDescription = 'A plugin that adds dense neural retrieval into the OpenSearch ecosytem'
@@ -313,16 +314,6 @@ run {
useCluster testClusters.integTest
}

spotless {
java {
removeUnusedImports()
importOrder 'java', 'javax', 'org', 'com'
eclipse().configFile rootProject.file('formatterConfig.xml')
trimTrailingWhitespace()
endWithNewline()
}
}

jacocoTestReport {
dependsOn integTest, test
reports {
@@ -331,6 +322,7 @@ jacocoTestReport {
}
}

check.dependsOn spotlessCheck
check.dependsOn jacocoTestCoverageVerification
jacocoTestCoverageVerification.dependsOn jacocoTestReport

File renamed without changes.
8 changes: 8 additions & 0 deletions formatter/license-header.txt
@@ -0,0 +1,8 @@
/*
* Copyright OpenSearch Contributors
* SPDX-License-Identifier: Apache-2.0
*
* The OpenSearch Contributors require contributions made to
* this file be licensed under the Apache-2.0 license or a
* compatible open source license.
*/
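
For reference, this is the header text that the new `license` formatting step (in `gradle/formatting.gradle`, shown next) prepends to Java sources under `src/*/java`, using the `"package "` keyword as the insertion delimiter. A minimal sketch of a file after `spotlessApply` has run; the class and package names below are hypothetical and not part of this commit:

```java
/*
 * Copyright OpenSearch Contributors
 * SPDX-License-Identifier: Apache-2.0
 *
 * The OpenSearch Contributors require contributions made to
 * this file be licensed under the Apache-2.0 license or a
 * compatible open source license.
 */

package org.opensearch.neuralsearch.example;

// Hypothetical class, present only to show the header sitting directly
// above the package declaration that the license step keys on.
public class ExampleClass {
}
```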
32 changes: 32 additions & 0 deletions gradle/formatting.gradle
@@ -0,0 +1,32 @@
allprojects {
project.apply plugin: "com.diffplug.spotless"
spotless {
java {
// Normally this isn't necessary, but we have Java sources in
// non-standard places
target '**/*.java'

removeUnusedImports()
eclipse().configFile rootProject.file('formatter/formatterConfig.xml')
trimTrailingWhitespace()
endWithNewline();

custom 'Refuse wildcard imports', {
// Wildcard imports can't be resolved; fail the build
if (it =~ /\s+import .*\*;/) {
throw new AssertionError("Do not use wildcard imports. 'spotlessApply' cannot resolve this issue.")
}
}
}
format 'misc', {
target '*.md', '*.gradle', '**/*.json', '**/*.yaml', '**/*.yml', '**/*.svg'

trimTrailingWhitespace()
endWithNewline()
}
format("license", {
licenseHeaderFile("${rootProject.file("formatter/license-header.txt")}", "package ");
target("src/*/java/**/*.java")
})
}
}
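
The custom `Refuse wildcard imports` step above deliberately fails the build instead of rewriting the offending line, since a wildcard cannot be resolved back to explicit imports by `spotlessApply`. A short sketch of the import style it expects; the class names here are illustrative only, not taken from this commit:

```java
package org.opensearch.neuralsearch.example;

// A wildcard form such as `import java.util.*` (with a trailing semicolon)
// matches the rule's regex and aborts the build with
// "Do not use wildcard imports. 'spotlessApply' cannot resolve this issue."
// Explicit imports pass, and removeUnusedImports() can still prune any that
// become unused.
import java.util.ArrayList;
import java.util.List;

public class ImportStyleExample {
    public List<String> names() {
        return new ArrayList<>();
    }
}
```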
2 changes: 1 addition & 1 deletion qa/build.gradle
@@ -171,4 +171,4 @@ task zipBwcPlugin(type: Zip) {
task bwcTestSuite {
dependsOn ":qa:restart-upgrade:testRestartUpgrade"
dependsOn ":qa:rolling-upgrade:testRollingUpgrade"
}
}
@@ -1,8 +1,11 @@
/*
* Copyright OpenSearch Contributors
* SPDX-License-Identifier: Apache-2.0
*
* The OpenSearch Contributors require contributions made to
* this file be licensed under the Apache-2.0 license or a
* compatible open source license.
*/

package org.opensearch.neuralsearch.bwc;

import java.util.Locale;
@@ -44,12 +47,12 @@ protected boolean preserveTemplatesUponCompletion() {
@Override
protected final Settings restClientSettings() {
return Settings.builder()
.put(super.restClientSettings())
// increase the timeout here to 90 seconds to handle long waits for a green
// cluster health. the waits for green need to be longer than a minute to
// account for delayed shards
.put(OpenSearchRestTestCase.CLIENT_SOCKET_TIMEOUT, CLIENT_TIMEOUT_VALUE)
.build();
.put(super.restClientSettings())
// increase the timeout here to 90 seconds to handle long waits for a green
// cluster health. the waits for green need to be longer than a minute to
// account for delayed shards
.put(OpenSearchRestTestCase.CLIENT_SOCKET_TIMEOUT, CLIENT_TIMEOUT_VALUE)
.build();
}

protected static final boolean isRunningAgainstOldCluster() {
@@ -59,4 +62,4 @@ protected static final boolean isRunningAgainstOldCluster() {
protected final Optional<String> getBWCVersion() {
return Optional.ofNullable(System.getProperty(BWC_VERSION, null));
}
}
}
@@ -1,62 +1,61 @@
/*
* Copyright OpenSearch Contributors
* SPDX-License-Identifier: Apache-2.0
*
* The OpenSearch Contributors require contributions made to
* this file be licensed under the Apache-2.0 license or a
* compatible open source license.
*/

package org.opensearch.neuralsearch.bwc;

import com.carrotsearch.randomizedtesting.RandomizedTest;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Map;
import java.util.ArrayList;


import static org.opensearch.neuralsearch.TestUtils.NODES_BWC_CLUSTER;

import static org.opensearch.neuralsearch.TestUtils.getModelId;
import static org.opensearch.neuralsearch.TestUtils.TEXT_EMBEDDING_PROCESSOR;
import org.opensearch.neuralsearch.query.NeuralQueryBuilder;


public class SemanticSearch extends AbstractRestartUpgradeRestTestCase{
public class SemanticSearch extends AbstractRestartUpgradeRestTestCase {

private static final String PIPELINE_NAME = "nlp-pipeline";
private static String DOC_ID = "0";
private static final String TEST_FIELD = "passage_text";
private static final String TEXT= "Hello world";
private static final String TEXT = "Hello world";

//Test restart-upgrade Semantic Search
//Create Text Embedding Processor, Ingestion Pipeline and add document
//Validate process , pipeline and document count in restart-upgrade scenario
public void testSemanticSearch() throws Exception{
// Test restart-upgrade Semantic Search
// Create Text Embedding Processor, Ingestion Pipeline and add document
// Validate process , pipeline and document count in restart-upgrade scenario
public void testSemanticSearch() throws Exception {
waitForClusterHealthGreen(NODES_BWC_CLUSTER);

if (isRunningAgainstOldCluster()){
String modelId= uploadTextEmbeddingModel();
if (isRunningAgainstOldCluster()) {
String modelId = uploadTextEmbeddingModel();
loadModel(modelId);
createPipelineProcessor(modelId,PIPELINE_NAME);
createPipelineProcessor(modelId, PIPELINE_NAME);
createIndexWithConfiguration(
testIndex,
Files.readString(Path.of(classLoader.getResource("processor/IndexMappings.json").toURI())),
PIPELINE_NAME
testIndex,
Files.readString(Path.of(classLoader.getResource("processor/IndexMappings.json").toURI())),
PIPELINE_NAME
);
addDocument(testIndex, DOC_ID,TEST_FIELD,TEXT);
}else {
Map<String,Object> pipeline= getIngestionPipeline(PIPELINE_NAME);
addDocument(testIndex, DOC_ID, TEST_FIELD, TEXT);
} else {
Map<String, Object> pipeline = getIngestionPipeline(PIPELINE_NAME);
assertNotNull(pipeline);
String modelId=getModelId(pipeline, TEXT_EMBEDDING_PROCESSOR);
String modelId = getModelId(pipeline, TEXT_EMBEDDING_PROCESSOR);
validateTestIndex(modelId);
deletePipeline(PIPELINE_NAME);
deleteModel(modelId);
deleteIndex(testIndex);
}
}


private void validateTestIndex(String modelId) throws Exception {
int docCount=getDocCount(testIndex);
assertEquals(1,docCount);
int docCount = getDocCount(testIndex);
assertEquals(1, docCount);
loadModel(modelId);
NeuralQueryBuilder neuralQueryBuilder = new NeuralQueryBuilder();
neuralQueryBuilder.fieldName("passage_embedding");
@@ -67,27 +66,26 @@ private void validateTestIndex(String modelId) throws Exception {
assertNotNull(response);
}


private String uploadTextEmbeddingModel() throws Exception {
String requestBody = Files.readString(Path.of(classLoader.getResource("processor/UploadModelRequestBody.json").toURI()));
return registerModelGroupAndGetModelId(requestBody);
}

private String registerModelGroupAndGetModelId(String requestBody) throws Exception {
String modelGroupRegisterRequestBody = Files.readString(
Path.of(classLoader.getResource("processor/CreateModelGroupRequestBody.json").toURI())
Path.of(classLoader.getResource("processor/CreateModelGroupRequestBody.json").toURI())
).replace("<MODEL_GROUP_NAME>", "public_model_" + RandomizedTest.randomAsciiAlphanumOfLength(8));

String modelGroupId=registerModelGroup(modelGroupRegisterRequestBody);
String modelGroupId = registerModelGroup(modelGroupRegisterRequestBody);

requestBody = requestBody.replace("<MODEL_GROUP_ID>", modelGroupId);

return uploadModelId(requestBody);
}

protected void createPipelineProcessor(String modelId, String pipelineName, ProcessorType processorType) throws Exception {
String requestBody=Files.readString(Path.of(classLoader.getResource("processor/PipelineConfiguration.json").toURI()));
createPipelineProcessor(requestBody,pipelineName,modelId);
String requestBody = Files.readString(Path.of(classLoader.getResource("processor/PipelineConfiguration.json").toURI()));
createPipelineProcessor(requestBody, pipelineName, modelId);
}

}
@@ -1,4 +1,4 @@
{
"name": "<MODEL_GROUP_NAME>",
"description": "This is a public model group"
}
}
@@ -30,4 +30,3 @@
}
}
}

@@ -10,4 +10,4 @@
}
}
]
}
}
@@ -12,4 +12,4 @@
"all_config": "{\"architectures\":[\"BertModel\"],\"max_position_embeddings\":512,\"model_type\":\"bert\",\"num_attention_heads\":12,\"num_hidden_layers\":6}"
},
"url": "https://github.com/opensearch-project/ml-commons/blob/2.x/ml-algorithms/src/test/resources/org/opensearch/ml/engine/algorithms/text_embedding/traced_small_model.zip?raw=true"
}
}
2 changes: 1 addition & 1 deletion qa/rolling-upgrade/build.gradle
@@ -118,4 +118,4 @@ task testRollingUpgrade(type: StandaloneRestIntegTestTask) {
nonInputProperties.systemProperty('tests.rest.cluster', "${-> testClusters."${baseName}".allHttpSocketURI.join(",")}")
nonInputProperties.systemProperty('tests.clustername', "${-> testClusters."${baseName}".getName()}")
systemProperty 'tests.security.manager', 'false'
}
}