diff --git a/.gitattributes b/.gitattributes
new file mode 100644
index 0000000000000..0c4d2c9b20b63
--- /dev/null
+++ b/.gitattributes
@@ -0,0 +1 @@
+CHANGELOG.asciidoc merge=union
diff --git a/.gitignore b/.gitignore
index 41a151f160cfa..8b2a7335ade9d 100644
--- a/.gitignore
+++ b/.gitignore
@@ -20,10 +20,8 @@ nbactions.xml
.gradle/
build/
-# maven stuff (to be removed when trunk becomes 4.x)
-*-execution-hints.log
-target/
-dependency-reduced-pom.xml
+# vscode stuff
+.vscode/
# testing stuff
**/.local*
@@ -43,4 +41,3 @@ html_docs
# random old stuff that we should look at the necessity of...
/tmp/
eclipse-build
-
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 69e90473a7f61..03b2674a4cc8c 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -107,6 +107,8 @@ We support development in the Eclipse and IntelliJ IDEs. For Eclipse, the
minimum version that we support is [Eclipse Oxygen][eclipse] (version 4.7). For
IntelliJ, the minimum version that we support is [IntelliJ 2017.2][intellij].
+### Configuring IDEs And Running Tests
+
Eclipse users can automatically configure their IDE: `./gradlew eclipse`
then `File: Import: Existing Projects into Workspace`. Select the
option `Search for nested projects`. Additionally you will want to
@@ -144,6 +146,9 @@ For IntelliJ, go to
For Eclipse, go to `Preferences->Java->Installed JREs` and add `-ea` to
`VM Arguments`.
+
+### Java Language Formatting Guidelines
+
Please follow these formatting guidelines:
* Java indent is 4 spaces
@@ -155,6 +160,33 @@ Please follow these formatting guidelines:
* IntelliJ: `Preferences/Settings->Editor->Code Style->Java->Imports`. There are two configuration options: `Class count to use import with '*'` and `Names count to use static import with '*'`. Set their values to 99999 or some other absurdly high value.
* Don't worry too much about import order. Try not to change it but don't worry about fighting your IDE to stop it from doing so.
+### License Headers
+
+We require license headers on all Java files. You will notice that all the Java files in
+the top-level `x-pack` directory contain a separate license from the rest of the repository. This
+directory contains commercial code that is associated with a separate license. It can be helpful
+to have the IDE automatically insert the appropriate license header depending on which part of the
+project a contribution is made to.
+
+#### IntelliJ: Copyright & Scope Profiles
+
+To have IntelliJ insert the correct license, it is necessary to create two copyright profiles,
+which may be named `apache2` and `commercial`. These can be created in
+`Preferences/Settings->Editor->Copyright->Copyright Profiles`. To associate these profiles with
+their respective directories, two "Scopes" will need to be created. These can be created in
+`Preferences/Settings->Appearance & Behavior->Scopes`. When creating scopes, be sure to choose
+the `shared` scope type. Create a scope, `apache2`, with
+the associated pattern of `!file[group:x-pack]:*/`. This pattern will exclude all the files contained in
+the `x-pack` directory. The other scope, `commercial`, will have the inverse pattern of `file[group:x-pack]:*/`.
+The two scopes, together, should account for all the files in the project. To associate the scopes
+with their copyright profiles, go into `Preferences/Settings->Editor->Copyright` and use the `+` to add
+the associations `apache2/apache2` and `commercial/commercial`.
+
+Configuring these options in IntelliJ can be quite buggy, so do not be alarmed if you have to open/close
+the settings window and/or restart IntelliJ to see your changes take effect.
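+
+For illustration, a Java file in the Apache-licensed part of the tree starts with a header
+along the following lines (the exact text is whatever you configure in the `apache2`
+copyright profile; this is a sketch, not the canonical wording):
+
+```java
+/*
+ * Licensed to Elasticsearch under one or more contributor
+ * license agreements. Elasticsearch licenses this file to you
+ * under the Apache License, Version 2.0 (the "License"); you
+ * may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ */
+```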
+
+### Creating A Distribution
+
To create a distribution from the source, simply run:
```sh
@@ -169,6 +201,8 @@ The archive distributions (tar and zip) can be found under:
`./distribution/archives/(tar|zip)/build/distributions/`
+### Running The Full Test Suite
+
Before submitting your changes, run the test suite to make sure that nothing is broken, with:
```sh
diff --git a/LICENSE.txt b/LICENSE.txt
index d645695673349..e601d4382ad6d 100644
--- a/LICENSE.txt
+++ b/LICENSE.txt
@@ -1,202 +1,13 @@
-
- Apache License
- Version 2.0, January 2004
- http://www.apache.org/licenses/
-
- TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
-
- 1. Definitions.
-
- "License" shall mean the terms and conditions for use, reproduction,
- and distribution as defined by Sections 1 through 9 of this document.
-
- "Licensor" shall mean the copyright owner or entity authorized by
- the copyright owner that is granting the License.
-
- "Legal Entity" shall mean the union of the acting entity and all
- other entities that control, are controlled by, or are under common
- control with that entity. For the purposes of this definition,
- "control" means (i) the power, direct or indirect, to cause the
- direction or management of such entity, whether by contract or
- otherwise, or (ii) ownership of fifty percent (50%) or more of the
- outstanding shares, or (iii) beneficial ownership of such entity.
-
- "You" (or "Your") shall mean an individual or Legal Entity
- exercising permissions granted by this License.
-
- "Source" form shall mean the preferred form for making modifications,
- including but not limited to software source code, documentation
- source, and configuration files.
-
- "Object" form shall mean any form resulting from mechanical
- transformation or translation of a Source form, including but
- not limited to compiled object code, generated documentation,
- and conversions to other media types.
-
- "Work" shall mean the work of authorship, whether in Source or
- Object form, made available under the License, as indicated by a
- copyright notice that is included in or attached to the work
- (an example is provided in the Appendix below).
-
- "Derivative Works" shall mean any work, whether in Source or Object
- form, that is based on (or derived from) the Work and for which the
- editorial revisions, annotations, elaborations, or other modifications
- represent, as a whole, an original work of authorship. For the purposes
- of this License, Derivative Works shall not include works that remain
- separable from, or merely link (or bind by name) to the interfaces of,
- the Work and Derivative Works thereof.
-
- "Contribution" shall mean any work of authorship, including
- the original version of the Work and any modifications or additions
- to that Work or Derivative Works thereof, that is intentionally
- submitted to Licensor for inclusion in the Work by the copyright owner
- or by an individual or Legal Entity authorized to submit on behalf of
- the copyright owner. For the purposes of this definition, "submitted"
- means any form of electronic, verbal, or written communication sent
- to the Licensor or its representatives, including but not limited to
- communication on electronic mailing lists, source code control systems,
- and issue tracking systems that are managed by, or on behalf of, the
- Licensor for the purpose of discussing and improving the Work, but
- excluding communication that is conspicuously marked or otherwise
- designated in writing by the copyright owner as "Not a Contribution."
-
- "Contributor" shall mean Licensor and any individual or Legal Entity
- on behalf of whom a Contribution has been received by Licensor and
- subsequently incorporated within the Work.
-
- 2. Grant of Copyright License. Subject to the terms and conditions of
- this License, each Contributor hereby grants to You a perpetual,
- worldwide, non-exclusive, no-charge, royalty-free, irrevocable
- copyright license to reproduce, prepare Derivative Works of,
- publicly display, publicly perform, sublicense, and distribute the
- Work and such Derivative Works in Source or Object form.
-
- 3. Grant of Patent License. Subject to the terms and conditions of
- this License, each Contributor hereby grants to You a perpetual,
- worldwide, non-exclusive, no-charge, royalty-free, irrevocable
- (except as stated in this section) patent license to make, have made,
- use, offer to sell, sell, import, and otherwise transfer the Work,
- where such license applies only to those patent claims licensable
- by such Contributor that are necessarily infringed by their
- Contribution(s) alone or by combination of their Contribution(s)
- with the Work to which such Contribution(s) was submitted. If You
- institute patent litigation against any entity (including a
- cross-claim or counterclaim in a lawsuit) alleging that the Work
- or a Contribution incorporated within the Work constitutes direct
- or contributory patent infringement, then any patent licenses
- granted to You under this License for that Work shall terminate
- as of the date such litigation is filed.
-
- 4. Redistribution. You may reproduce and distribute copies of the
- Work or Derivative Works thereof in any medium, with or without
- modifications, and in Source or Object form, provided that You
- meet the following conditions:
-
- (a) You must give any other recipients of the Work or
- Derivative Works a copy of this License; and
-
- (b) You must cause any modified files to carry prominent notices
- stating that You changed the files; and
-
- (c) You must retain, in the Source form of any Derivative Works
- that You distribute, all copyright, patent, trademark, and
- attribution notices from the Source form of the Work,
- excluding those notices that do not pertain to any part of
- the Derivative Works; and
-
- (d) If the Work includes a "NOTICE" text file as part of its
- distribution, then any Derivative Works that You distribute must
- include a readable copy of the attribution notices contained
- within such NOTICE file, excluding those notices that do not
- pertain to any part of the Derivative Works, in at least one
- of the following places: within a NOTICE text file distributed
- as part of the Derivative Works; within the Source form or
- documentation, if provided along with the Derivative Works; or,
- within a display generated by the Derivative Works, if and
- wherever such third-party notices normally appear. The contents
- of the NOTICE file are for informational purposes only and
- do not modify the License. You may add Your own attribution
- notices within Derivative Works that You distribute, alongside
- or as an addendum to the NOTICE text from the Work, provided
- that such additional attribution notices cannot be construed
- as modifying the License.
-
- You may add Your own copyright statement to Your modifications and
- may provide additional or different license terms and conditions
- for use, reproduction, or distribution of Your modifications, or
- for any such Derivative Works as a whole, provided Your use,
- reproduction, and distribution of the Work otherwise complies with
- the conditions stated in this License.
-
- 5. Submission of Contributions. Unless You explicitly state otherwise,
- any Contribution intentionally submitted for inclusion in the Work
- by You to the Licensor shall be under the terms and conditions of
- this License, without any additional terms or conditions.
- Notwithstanding the above, nothing herein shall supersede or modify
- the terms of any separate license agreement you may have executed
- with Licensor regarding such Contributions.
-
- 6. Trademarks. This License does not grant permission to use the trade
- names, trademarks, service marks, or product names of the Licensor,
- except as required for reasonable and customary use in describing the
- origin of the Work and reproducing the content of the NOTICE file.
-
- 7. Disclaimer of Warranty. Unless required by applicable law or
- agreed to in writing, Licensor provides the Work (and each
- Contributor provides its Contributions) on an "AS IS" BASIS,
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
- implied, including, without limitation, any warranties or conditions
- of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
- PARTICULAR PURPOSE. You are solely responsible for determining the
- appropriateness of using or redistributing the Work and assume any
- risks associated with Your exercise of permissions under this License.
-
- 8. Limitation of Liability. In no event and under no legal theory,
- whether in tort (including negligence), contract, or otherwise,
- unless required by applicable law (such as deliberate and grossly
- negligent acts) or agreed to in writing, shall any Contributor be
- liable to You for damages, including any direct, indirect, special,
- incidental, or consequential damages of any character arising as a
- result of this License or out of the use or inability to use the
- Work (including but not limited to damages for loss of goodwill,
- work stoppage, computer failure or malfunction, or any and all
- other commercial damages or losses), even if such Contributor
- has been advised of the possibility of such damages.
-
- 9. Accepting Warranty or Additional Liability. While redistributing
- the Work or Derivative Works thereof, You may choose to offer,
- and charge a fee for, acceptance of support, warranty, indemnity,
- or other liability obligations and/or rights consistent with this
- License. However, in accepting such obligations, You may act only
- on Your own behalf and on Your sole responsibility, not on behalf
- of any other Contributor, and only if You agree to indemnify,
- defend, and hold each Contributor harmless for any liability
- incurred by, or claims asserted against, such Contributor by reason
- of your accepting any such warranty or additional liability.
-
- END OF TERMS AND CONDITIONS
-
- APPENDIX: How to apply the Apache License to your work.
-
- To apply the Apache License to your work, attach the following
- boilerplate notice, with the fields enclosed by brackets "[]"
- replaced with your own identifying information. (Don't include
- the brackets!) The text should be enclosed in the appropriate
- comment syntax for the file format. We also recommend that a
- file or class name and description of purpose be included on the
- same "printed page" as the copyright notice for easier
- identification within third-party archives.
-
- Copyright [yyyy] [name of copyright owner]
-
- Licensed under the Apache License, Version 2.0 (the "License");
- you may not use this file except in compliance with the License.
- You may obtain a copy of the License at
-
- http://www.apache.org/licenses/LICENSE-2.0
-
- Unless required by applicable law or agreed to in writing, software
- distributed under the License is distributed on an "AS IS" BASIS,
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- See the License for the specific language governing permissions and
- limitations under the License.
+Source code in this repository is variously licensed under the Apache License
+Version 2.0, an Apache compatible license, or the Elastic License. Outside of
+the "x-pack" folder, source code in a given file is licensed under the Apache
+License Version 2.0, unless otherwise noted at the beginning of the file or a
+LICENSE file present in the directory subtree declares a separate license.
+Within the "x-pack" folder, source code in a given file is licensed under the
+Elastic License, unless otherwise noted at the beginning of the file or a
+LICENSE file present in the directory subtree declares a separate license.
+
+The build produces two sets of binaries - one set that falls under the Elastic
+License and another set that falls under Apache License Version 2.0. The
+binaries that contain `-oss` in the artifact name are licensed under the Apache
+License Version 2.0.
diff --git a/build.gradle b/build.gradle
index 8218d49fd68ff..c538c0cb898ef 100644
--- a/build.gradle
+++ b/build.gradle
@@ -20,6 +20,7 @@
import org.apache.tools.ant.taskdefs.condition.Os
import org.elasticsearch.gradle.BuildPlugin
+import org.elasticsearch.gradle.LoggedExec
import org.elasticsearch.gradle.Version
import org.elasticsearch.gradle.VersionCollection
import org.elasticsearch.gradle.VersionProperties
@@ -30,6 +31,7 @@ import org.gradle.api.tasks.wrapper.Wrapper.DistributionType
import org.gradle.util.GradleVersion
import org.gradle.util.DistributionLocator
+import java.nio.file.Files
import java.nio.file.Path
import java.security.MessageDigest
@@ -40,10 +42,9 @@ subprojects {
description = "Elasticsearch subproject ${project.path}"
}
-Path rootPath = rootDir.toPath()
-// setup pom license info, but only for artifacts that are part of elasticsearch
-configure(subprojects.findAll { it.projectDir.toPath().startsWith(rootPath) }) {
-
+subprojects {
+ project.ext.licenseName = 'The Apache Software License, Version 2.0'
+ project.ext.licenseUrl = 'http://www.apache.org/licenses/LICENSE-2.0.txt'
// we only use maven publish to add tasks for pom generation
plugins.withType(MavenPublishPlugin).whenPluginAdded {
publishing {
@@ -55,8 +56,8 @@ configure(subprojects.findAll { it.projectDir.toPath().startsWith(rootPath) }) {
node.appendNode('inceptionYear', '2009')
Node license = node.appendNode('licenses').appendNode('license')
- license.appendNode('name', 'The Apache Software License, Version 2.0')
- license.appendNode('url', 'http://www.apache.org/licenses/LICENSE-2.0.txt')
+ license.appendNode('name', project.licenseName)
+ license.appendNode('url', project.licenseUrl)
license.appendNode('distribution', 'repo')
Node developer = node.appendNode('developers').appendNode('developer')
@@ -68,7 +69,7 @@ configure(subprojects.findAll { it.projectDir.toPath().startsWith(rootPath) }) {
}
}
plugins.withType(BuildPlugin).whenPluginAdded {
- project.licenseFile = project.rootProject.file('LICENSE.txt')
+ project.licenseFile = project.rootProject.file('licenses/APACHE-LICENSE-2.0.txt')
project.noticeFile = project.rootProject.file('NOTICE.txt')
}
}
@@ -206,9 +207,13 @@ subprojects {
"org.elasticsearch.test:framework:${version}": ':test:framework',
"org.elasticsearch.distribution.integ-test-zip:elasticsearch:${version}": ':distribution:archives:integ-test-zip',
"org.elasticsearch.distribution.zip:elasticsearch:${version}": ':distribution:archives:zip',
+ "org.elasticsearch.distribution.zip:elasticsearch-oss:${version}": ':distribution:archives:oss-zip',
"org.elasticsearch.distribution.tar:elasticsearch:${version}": ':distribution:archives:tar',
+ "org.elasticsearch.distribution.tar:elasticsearch-oss:${version}": ':distribution:archives:oss-tar',
"org.elasticsearch.distribution.rpm:elasticsearch:${version}": ':distribution:packages:rpm',
+ "org.elasticsearch.distribution.rpm:elasticsearch-oss:${version}": ':distribution:packages:oss-rpm',
"org.elasticsearch.distribution.deb:elasticsearch:${version}": ':distribution:packages:deb',
+ "org.elasticsearch.distribution.deb:elasticsearch-oss:${version}": ':distribution:packages:oss-deb',
"org.elasticsearch.test:logger-usage:${version}": ':test:logger-usage',
// for transport client
"org.elasticsearch.plugin:transport-netty4-client:${version}": ':modules:transport-netty4',
@@ -228,6 +233,11 @@ subprojects {
ext.projectSubstitutions["org.elasticsearch.distribution.deb:elasticsearch:${snapshot}"] = snapshotProject
ext.projectSubstitutions["org.elasticsearch.distribution.rpm:elasticsearch:${snapshot}"] = snapshotProject
ext.projectSubstitutions["org.elasticsearch.distribution.zip:elasticsearch:${snapshot}"] = snapshotProject
+ if (snapshot.onOrAfter('6.3.0')) {
+ ext.projectSubstitutions["org.elasticsearch.distribution.deb:elasticsearch-oss:${snapshot}"] = snapshotProject
+ ext.projectSubstitutions["org.elasticsearch.distribution.rpm:elasticsearch-oss:${snapshot}"] = snapshotProject
+ ext.projectSubstitutions["org.elasticsearch.distribution.zip:elasticsearch-oss:${snapshot}"] = snapshotProject
+ }
}
}
@@ -451,6 +461,59 @@ gradle.projectsEvaluated {
}
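+/*
+ * Asserts that the file at the given path starts with the given expected lines, in order,
+ * failing the build with a descriptive message on the first mismatch.
+ */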
+static void assertLinesInFile(final Path path, final List<String> expectedLines) {
+ final List<String> actualLines = Files.readAllLines(path)
+ int line = 0
+ for (final String expectedLine : expectedLines) {
+ final String actualLine = actualLines.get(line)
+ if (expectedLine != actualLine) {
+ throw new GradleException("expected line [${line + 1}] in [${path}] to be [${expectedLine}] but was [${actualLine}]")
+ }
+ line++
+ }
+}
+
+/*
+ * Check that all generated JARs have our NOTICE.txt and an appropriate
+ * LICENSE.txt in them. We configure this in Gradle but we'd like to
+ * be extra paranoid.
+ */
+subprojects { project ->
+ project.tasks.withType(Jar).whenTaskAdded { jarTask ->
+ final Task extract = project.task("extract${jarTask.name.capitalize()}", type: LoggedExec) {
+ dependsOn jarTask
+ ext.destination = project.buildDir.toPath().resolve("jar-extracted/${jarTask.name}")
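+ // use the JDK's jar tool to extract only the license and notice files from the built jar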
+ commandLine "${->new File(rootProject.compilerJavaHome, 'bin/jar')}",
+ 'xf', "${-> jarTask.outputs.files.singleFile}", 'META-INF/LICENSE.txt', 'META-INF/NOTICE.txt'
+ workingDir destination
+ doFirst {
+ project.delete(destination)
+ Files.createDirectories(destination)
+ }
+ }
+
+ final Task checkNotice = project.task("verify${jarTask.name.capitalize()}Notice") {
+ dependsOn extract
+ doLast {
+ final List<String> noticeLines = Files.readAllLines(project.noticeFile.toPath())
+ final Path noticePath = extract.destination.resolve('META-INF/NOTICE.txt')
+ assertLinesInFile(noticePath, noticeLines)
+ }
+ }
+ project.check.dependsOn checkNotice
+
+ final Task checkLicense = project.task("verify${jarTask.name.capitalize()}License") {
+ dependsOn extract
+ doLast {
+ final List<String> licenseLines = Files.readAllLines(project.licenseFile.toPath())
+ final Path licensePath = extract.destination.resolve('META-INF/LICENSE.txt')
+ assertLinesInFile(licensePath, licenseLines)
+ }
+ }
+ project.check.dependsOn checkLicense
+ }
+}
+
/* Remove assemble on all qa projects because we don't need to publish
* artifacts for them. */
gradle.projectsEvaluated {
diff --git a/buildSrc/src/main/groovy/org/elasticsearch/gradle/BuildPlugin.groovy b/buildSrc/src/main/groovy/org/elasticsearch/gradle/BuildPlugin.groovy
index 3103f23472ed7..a44b9c849d333 100644
--- a/buildSrc/src/main/groovy/org/elasticsearch/gradle/BuildPlugin.groovy
+++ b/buildSrc/src/main/groovy/org/elasticsearch/gradle/BuildPlugin.groovy
@@ -38,6 +38,7 @@ import org.gradle.api.artifacts.ModuleVersionIdentifier
import org.gradle.api.artifacts.ProjectDependency
import org.gradle.api.artifacts.ResolvedArtifact
import org.gradle.api.artifacts.dsl.RepositoryHandler
+import org.gradle.api.execution.TaskExecutionGraph
import org.gradle.api.plugins.JavaPlugin
import org.gradle.api.publish.maven.MavenPublication
import org.gradle.api.publish.maven.plugins.MavenPublishPlugin
@@ -221,21 +222,34 @@ class BuildPlugin implements Plugin<Project> {
return System.getenv('JAVA' + version + '_HOME')
}
- /**
- * Get Java home for the project for the specified version. If the specified version is not configured, an exception with the specified
- * message is thrown.
- *
- * @param project the project
- * @param version the version of Java home to obtain
- * @param message the exception message if Java home for the specified version is not configured
- * @return Java home for the specified version
- * @throws GradleException if Java home for the specified version is not configured
- */
- static String getJavaHome(final Project project, final int version, final String message) {
- if (project.javaVersions.get(version) == null) {
- throw new GradleException(message)
+ /** Add a check before gradle execution phase which ensures java home for the given java version is set. */
+ static void requireJavaHome(Task task, int version) {
+ Project rootProject = task.project.rootProject // use root project for global accounting
+ if (rootProject.hasProperty('requiredJavaVersions') == false) {
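+ // lazily create the version -> tasks map and register a single task-graph listener the first time through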
+ rootProject.ext.requiredJavaVersions = [:].withDefault{key -> return []}
+ rootProject.gradle.taskGraph.whenReady { TaskExecutionGraph taskGraph ->
+ List<String> messages = []
+ for (entry in rootProject.requiredJavaVersions) {
+ if (rootProject.javaVersions.get(entry.key) != null) {
+ continue
+ }
+ List<Task> tasks = entry.value.findAll { taskGraph.hasTask(it) }.collect { " ${it.path}" }
+ if (tasks.isEmpty() == false) {
+ messages.add("JAVA${entry.key}_HOME required to run tasks:\n${tasks.join('\n')}")
+ }
+ }
+ if (messages.isEmpty() == false) {
+ throw new GradleException(messages.join('\n'))
+ }
+ }
}
- return project.javaVersions.get(version)
+ rootProject.requiredJavaVersions.get(version).add(task)
+ }
+
+ /** A convenience method for getting java home for a version of java and requiring that version for the given task to execute */
+ static String getJavaHome(final Task task, final int version) {
+ requireJavaHome(task, version)
+ return task.project.javaVersions.get(version)
}
private static String findRuntimeJavaHome(final String compilerJavaHome) {
@@ -605,6 +619,7 @@ class BuildPlugin implements Plugin<Project> {
jarTask.metaInf {
from(project.licenseFile.parent) {
include project.licenseFile.name
+ rename { 'LICENSE.txt' }
}
from(project.noticeFile.parent) {
include project.noticeFile.name
diff --git a/buildSrc/src/main/groovy/org/elasticsearch/gradle/plugin/MetaPluginBuildPlugin.groovy b/buildSrc/src/main/groovy/org/elasticsearch/gradle/plugin/MetaPluginBuildPlugin.groovy
index 6c1857b3e7bf9..acb8f57d9d72c 100644
--- a/buildSrc/src/main/groovy/org/elasticsearch/gradle/plugin/MetaPluginBuildPlugin.groovy
+++ b/buildSrc/src/main/groovy/org/elasticsearch/gradle/plugin/MetaPluginBuildPlugin.groovy
@@ -37,10 +37,11 @@ class MetaPluginBuildPlugin implements Plugin<Project> {
project.plugins.apply(RestTestPlugin)
createBundleTask(project)
- boolean isModule = project.path.startsWith(':modules:')
+ boolean isModule = project.path.startsWith(':modules:') || project.path.startsWith(':x-pack:plugin')
project.integTestCluster {
dependsOn(project.bundlePlugin)
+ distribution = 'integ-test-zip'
}
BuildPlugin.configurePomGeneration(project)
project.afterEvaluate {
@@ -49,9 +50,9 @@ class MetaPluginBuildPlugin implements Plugin {
if (project.integTestCluster.distribution == 'integ-test-zip') {
project.integTestCluster.module(project)
}
- } else {
+ } else {
project.integTestCluster.plugin(project.path)
- }
+ }
}
RunTask run = project.tasks.create('run', RunTask)
diff --git a/buildSrc/src/main/groovy/org/elasticsearch/gradle/plugin/PluginBuildPlugin.groovy b/buildSrc/src/main/groovy/org/elasticsearch/gradle/plugin/PluginBuildPlugin.groovy
index 80cb376077ed1..28008f4313c97 100644
--- a/buildSrc/src/main/groovy/org/elasticsearch/gradle/plugin/PluginBuildPlugin.groovy
+++ b/buildSrc/src/main/groovy/org/elasticsearch/gradle/plugin/PluginBuildPlugin.groovy
@@ -50,7 +50,8 @@ public class PluginBuildPlugin extends BuildPlugin {
// this afterEvaluate must happen before the afterEvaluate added by integTest creation,
// so that the file name resolution for installing the plugin will be setup
project.afterEvaluate {
- boolean isModule = project.path.startsWith(':modules:')
+ boolean isXPackModule = project.path.startsWith(':x-pack:plugin')
+ boolean isModule = project.path.startsWith(':modules:') || isXPackModule
String name = project.pluginProperties.extension.name
project.archivesBaseName = name
@@ -70,9 +71,13 @@ public class PluginBuildPlugin extends BuildPlugin {
if (isModule) {
project.integTestCluster.module(project)
project.tasks.run.clusterConfig.module(project)
+ project.tasks.run.clusterConfig.distribution = 'integ-test-zip'
} else {
project.integTestCluster.plugin(project.path)
project.tasks.run.clusterConfig.plugin(project.path)
+ }
+
+ if (isModule == false || isXPackModule) {
addZipPomGeneration(project)
addNoticeGeneration(project)
}
@@ -256,6 +261,7 @@ public class PluginBuildPlugin extends BuildPlugin {
if (licenseFile != null) {
project.bundlePlugin.from(licenseFile.parentFile) {
include(licenseFile.name)
+ rename { 'LICENSE.txt' }
}
}
File noticeFile = project.pluginProperties.extension.noticeFile
diff --git a/buildSrc/src/main/groovy/org/elasticsearch/gradle/test/ClusterFormationTasks.groovy b/buildSrc/src/main/groovy/org/elasticsearch/gradle/test/ClusterFormationTasks.groovy
index 5f9e4c49b34e9..ed066ddc96baa 100644
--- a/buildSrc/src/main/groovy/org/elasticsearch/gradle/test/ClusterFormationTasks.groovy
+++ b/buildSrc/src/main/groovy/org/elasticsearch/gradle/test/ClusterFormationTasks.groovy
@@ -20,6 +20,7 @@ package org.elasticsearch.gradle.test
import org.apache.tools.ant.DefaultLogger
import org.apache.tools.ant.taskdefs.condition.Os
+import org.elasticsearch.gradle.BuildPlugin
import org.elasticsearch.gradle.LoggedExec
import org.elasticsearch.gradle.Version
import org.elasticsearch.gradle.VersionProperties
@@ -130,13 +131,22 @@ class ClusterFormationTasks {
/** Adds a dependency on the given distribution */
static void configureDistributionDependency(Project project, String distro, Configuration configuration, Version elasticsearchVersion) {
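+ // oss- flavored distributions only exist for 6.3.0 and later; before that there was a single distribution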
+ if (elasticsearchVersion.before('6.3.0') && distro.startsWith('oss-')) {
+ distro = distro.substring('oss-'.length())
+ }
String packaging = distro
- if (distro == 'tar') {
- packaging = 'tar.gz'
- } else if (distro == 'integ-test-zip') {
+ if (distro.contains('tar')) {
+ packaging = 'tar.gz'
+ } else if (distro.contains('zip')) {
packaging = 'zip'
}
- project.dependencies.add(configuration.name, "org.elasticsearch.distribution.${distro}:elasticsearch:${elasticsearchVersion}@${packaging}")
+ String subgroup = distro
+ String artifactName = 'elasticsearch'
+ if (distro.contains('oss')) {
+ artifactName += '-oss'
+ subgroup = distro.substring('oss-'.length())
+ }
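+ // e.g. 'oss-zip' resolves to org.elasticsearch.distribution.zip:elasticsearch-oss:<version>@zip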
+ project.dependencies.add(configuration.name, "org.elasticsearch.distribution.${subgroup}:${artifactName}:${elasticsearchVersion}@${packaging}")
}
/** Adds a dependency on a different version of the given plugin, which will be retrieved using gradle's dependency resolution */
@@ -259,6 +269,7 @@ class ClusterFormationTasks {
switch (node.config.distribution) {
case 'integ-test-zip':
case 'zip':
+ case 'oss-zip':
extract = project.tasks.create(name: name, type: Copy, dependsOn: extractDependsOn) {
from {
project.zipTree(configuration.singleFile)
@@ -267,6 +278,7 @@ class ClusterFormationTasks {
}
break;
case 'tar':
+ case 'oss-tar':
extract = project.tasks.create(name: name, type: Copy, dependsOn: extractDependsOn) {
from {
project.tarTree(project.resources.gzip(configuration.singleFile))
@@ -551,16 +563,17 @@ class ClusterFormationTasks {
/** Adds a task to execute a command to help setup the cluster */
static Task configureExecTask(String name, Project project, Task setup, NodeInfo node, Object[] execArgs) {
- return project.tasks.create(name: name, type: LoggedExec, dependsOn: setup) {
- workingDir node.cwd
+ return project.tasks.create(name: name, type: LoggedExec, dependsOn: setup) { Exec exec ->
+ exec.workingDir node.cwd
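+ // run setup commands with the same JAVA_HOME the node itself will use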
+ exec.environment 'JAVA_HOME', node.getJavaHome()
if (Os.isFamily(Os.FAMILY_WINDOWS)) {
- executable 'cmd'
- args '/C', 'call'
+ exec.executable 'cmd'
+ exec.args '/C', 'call'
// On Windows the comma character is considered a parameter separator:
// argument are wrapped in an ExecArgWrapper that escapes commas
- args execArgs.collect { a -> new EscapeCommaWrapper(arg: a) }
+ exec.args execArgs.collect { a -> new EscapeCommaWrapper(arg: a) }
} else {
- commandLine execArgs
+ exec.commandLine execArgs
}
}
}
@@ -607,6 +620,9 @@ class ClusterFormationTasks {
}
Task start = project.tasks.create(name: name, type: DefaultTask, dependsOn: setup)
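+        // if this node needs a specific java version, fail at task-graph time when the matching JAVA<N>_HOME is missing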
+ if (node.javaVersion != null) {
+ BuildPlugin.requireJavaHome(start, node.javaVersion)
+ }
start.doLast(elasticsearchRunner)
return start
}
diff --git a/buildSrc/src/main/groovy/org/elasticsearch/gradle/test/NodeInfo.groovy b/buildSrc/src/main/groovy/org/elasticsearch/gradle/test/NodeInfo.groovy
index 1fc944eeec6eb..5e67dfa55cfd4 100644
--- a/buildSrc/src/main/groovy/org/elasticsearch/gradle/test/NodeInfo.groovy
+++ b/buildSrc/src/main/groovy/org/elasticsearch/gradle/test/NodeInfo.groovy
@@ -36,6 +36,9 @@ import static org.elasticsearch.gradle.BuildPlugin.getJavaHome
* A container for the files and configuration associated with a single node in a test cluster.
*/
class NodeInfo {
+ /** Gradle project this node is part of */
+ Project project
+
/** common configuration for all nodes, including this one */
ClusterConfiguration config
@@ -84,6 +87,9 @@ class NodeInfo {
/** directory to install plugins from */
File pluginsTmpDir
+ /** Major version of java this node runs with, or {@code null} if using the runtime java version */
+ Integer javaVersion
+
/** environment variables to start the node with */
Map<String, String> env
@@ -109,6 +115,7 @@ class NodeInfo {
NodeInfo(ClusterConfiguration config, int nodeNum, Project project, String prefix, Version nodeVersion, File sharedDir) {
this.config = config
this.nodeNum = nodeNum
+ this.project = project
this.sharedDir = sharedDir
if (config.clusterName != null) {
clusterName = config.clusterName
@@ -165,12 +172,11 @@ class NodeInfo {
args.add("${esScript}")
}
+
if (nodeVersion.before("6.2.0")) {
- env = ['JAVA_HOME': "${-> getJavaHome(project, 8, "JAVA8_HOME must be set to run BWC tests against [" + nodeVersion + "]")}"]
+ javaVersion = 8
} else if (nodeVersion.onOrAfter("6.2.0") && nodeVersion.before("6.3.0")) {
- env = ['JAVA_HOME': "${-> getJavaHome(project, 9, "JAVA9_HOME must be set to run BWC tests against [" + nodeVersion + "]")}"]
- } else {
- env = ['JAVA_HOME': (String) project.runtimeJavaHome]
+ javaVersion = 9
}
args.addAll("-E", "node.portsfile=true")
@@ -182,7 +188,7 @@ class NodeInfo {
// in the cluster-specific options
esJavaOpts = String.join(" ", "-ea", "-esa", esJavaOpts)
}
- env.put('ES_JAVA_OPTS', esJavaOpts)
+ env = ['ES_JAVA_OPTS': esJavaOpts]
for (Map.Entry<String, String> property : System.properties.entrySet()) {
if (property.key.startsWith('tests.es.')) {
args.add("-E")
@@ -242,6 +248,11 @@ class NodeInfo {
return Native.toString(shortPath).substring(4)
}
+ /** Return the java home used by this node. */
+ String getJavaHome() {
+ return javaVersion == null ? project.runtimeJavaHome : project.javaVersions.get(javaVersion)
+ }
+
/** Returns debug string for the command that started this node. */
String getCommandString() {
String esCommandString = "\nNode ${nodeNum} configuration:\n"
@@ -249,6 +260,7 @@ class NodeInfo {
esCommandString += "| cwd: ${cwd}\n"
esCommandString += "| command: ${executable} ${args.join(' ')}\n"
esCommandString += '| environment:\n'
+ esCommandString += "| JAVA_HOME: ${javaHome}\n"
env.each { k, v -> esCommandString += "| ${k}: ${v}\n" }
if (config.daemonize) {
esCommandString += "|\n| [${wrapperScript.name}]\n"
@@ -300,6 +312,8 @@ class NodeInfo {
case 'integ-test-zip':
case 'zip':
case 'tar':
+ case 'oss-zip':
+ case 'oss-tar':
path = "elasticsearch-${nodeVersion}"
break
case 'rpm':
@@ -316,7 +330,9 @@ class NodeInfo {
switch (distro) {
case 'integ-test-zip':
case 'zip':
+ case 'oss-zip':
case 'tar':
+ case 'oss-tar':
return new File(homeDir(baseDir, distro, nodeVersion), 'config')
case 'rpm':
case 'deb':
diff --git a/buildSrc/src/main/groovy/org/elasticsearch/gradle/test/RestIntegTestTask.groovy b/buildSrc/src/main/groovy/org/elasticsearch/gradle/test/RestIntegTestTask.groovy
index 3c7554453b5e2..242ed45eee86e 100644
--- a/buildSrc/src/main/groovy/org/elasticsearch/gradle/test/RestIntegTestTask.groovy
+++ b/buildSrc/src/main/groovy/org/elasticsearch/gradle/test/RestIntegTestTask.groovy
@@ -24,6 +24,7 @@ import org.elasticsearch.gradle.VersionProperties
import org.gradle.api.DefaultTask
import org.gradle.api.Project
import org.gradle.api.Task
+import org.gradle.api.Transformer
import org.gradle.api.execution.TaskExecutionAdapter
import org.gradle.api.internal.tasks.options.Option
import org.gradle.api.provider.Property
diff --git a/buildSrc/src/main/groovy/org/elasticsearch/gradle/vagrant/VagrantPropertiesExtension.groovy b/buildSrc/src/main/groovy/org/elasticsearch/gradle/vagrant/VagrantPropertiesExtension.groovy
index c6d0f1d0425d0..264a1e0f8ac17 100644
--- a/buildSrc/src/main/groovy/org/elasticsearch/gradle/vagrant/VagrantPropertiesExtension.groovy
+++ b/buildSrc/src/main/groovy/org/elasticsearch/gradle/vagrant/VagrantPropertiesExtension.groovy
@@ -18,6 +18,7 @@
*/
package org.elasticsearch.gradle.vagrant
+import org.elasticsearch.gradle.Version
import org.gradle.api.tasks.Input
class VagrantPropertiesExtension {
@@ -26,7 +27,7 @@ class VagrantPropertiesExtension {
List<String> boxes
@Input
- String upgradeFromVersion
+ Version upgradeFromVersion
@Input
List<String> upgradeFromVersions
diff --git a/buildSrc/src/main/groovy/org/elasticsearch/gradle/vagrant/VagrantTestPlugin.groovy b/buildSrc/src/main/groovy/org/elasticsearch/gradle/vagrant/VagrantTestPlugin.groovy
index d7d1c01e7dd00..7a0b9f96781df 100644
--- a/buildSrc/src/main/groovy/org/elasticsearch/gradle/vagrant/VagrantTestPlugin.groovy
+++ b/buildSrc/src/main/groovy/org/elasticsearch/gradle/vagrant/VagrantTestPlugin.groovy
@@ -3,6 +3,7 @@ package org.elasticsearch.gradle.vagrant
import org.apache.tools.ant.taskdefs.condition.Os
import org.elasticsearch.gradle.FileContentsTask
import org.elasticsearch.gradle.LoggedExec
+import org.elasticsearch.gradle.Version
import org.gradle.api.*
import org.gradle.api.artifacts.dsl.RepositoryHandler
import org.gradle.api.execution.TaskExecutionAdapter
@@ -36,8 +37,15 @@ class VagrantTestPlugin implements Plugin<Project> {
'ubuntu-1404',
]
- /** All onboarded archives by default, available for Bats tests even if not used **/
- static List<String> DISTRIBUTION_ARCHIVES = ['tar', 'rpm', 'deb']
+ /** All distributions to bring into test VM, whether or not they are used **/
+ static List<String> DISTRIBUTIONS = [
+ 'archives:tar',
+ 'archives:oss-tar',
+ 'packages:rpm',
+ 'packages:oss-rpm',
+ 'packages:deb',
+ 'packages:oss-deb'
+ ]
/** Packages onboarded for upgrade tests **/
static List<String> UPGRADE_FROM_ARCHIVES = ['rpm', 'deb']
@@ -105,21 +113,19 @@ class VagrantTestPlugin implements Plugin {
private static void createPackagingConfiguration(Project project) {
project.configurations.create(PACKAGING_CONFIGURATION)
- String upgradeFromVersion = System.getProperty("tests.packaging.upgradeVersion")
- if (upgradeFromVersion == null) {
+ String upgradeFromVersionRaw = System.getProperty("tests.packaging.upgradeVersion");
+ Version upgradeFromVersion
+ if (upgradeFromVersionRaw == null) {
String firstPartOfSeed = project.rootProject.testSeed.tokenize(':').get(0)
final long seed = Long.parseUnsignedLong(firstPartOfSeed, 16)
final def indexCompatVersions = project.bwcVersions.indexCompatible
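+ // pick the upgrade-from version deterministically from the test seed so repeated runs choose the same version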
upgradeFromVersion = indexCompatVersions[new Random(seed).nextInt(indexCompatVersions.size())]
+ } else {
+ upgradeFromVersion = Version.fromString(upgradeFromVersionRaw)
}
- DISTRIBUTION_ARCHIVES.each {
+ DISTRIBUTIONS.each {
// Adds a dependency for the current version
- if (it == 'tar') {
- it = 'archives:tar'
- } else {
- it = "packages:${it}"
- }
project.dependencies.add(PACKAGING_CONFIGURATION,
project.dependencies.project(path: ":distribution:${it}", configuration: 'default'))
}
@@ -128,6 +134,10 @@ class VagrantTestPlugin implements Plugin {
// The version of elasticsearch that we upgrade *from*
project.dependencies.add(PACKAGING_CONFIGURATION,
"org.elasticsearch.distribution.${it}:elasticsearch:${upgradeFromVersion}@${it}")
+ if (upgradeFromVersion.onOrAfter('6.3.0')) {
+ project.dependencies.add(PACKAGING_CONFIGURATION,
+ "org.elasticsearch.distribution.${it}:elasticsearch-oss:${upgradeFromVersion}@${it}")
+ }
}
project.extensions.esvagrant.upgradeFromVersion = upgradeFromVersion
@@ -173,7 +183,17 @@ class VagrantTestPlugin implements Plugin {
Task createUpgradeFromFile = project.tasks.create('createUpgradeFromFile', FileContentsTask) {
dependsOn copyPackagingArchives
file "${archivesDir}/upgrade_from_version"
- contents project.extensions.esvagrant.upgradeFromVersion
+ contents project.extensions.esvagrant.upgradeFromVersion.toString()
+ }
+
+ Task createUpgradeIsOssFile = project.tasks.create('createUpgradeIsOssFile', FileContentsTask) {
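+        // writes an empty marker file telling the packaging tests that the upgrade-from version also ships an oss distribution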
+ dependsOn copyPackagingArchives
+ doFirst {
+ project.delete("${archivesDir}/upgrade_is_oss")
+ }
+ onlyIf { project.extensions.esvagrant.upgradeFromVersion.onOrAfter('6.3.0') }
+ file "${archivesDir}/upgrade_is_oss"
+ contents ''
}
File batsDir = new File(packagingDir, BATS)
@@ -214,7 +234,7 @@ class VagrantTestPlugin implements Plugin {
Task vagrantSetUpTask = project.tasks.create('setupPackagingTest')
vagrantSetUpTask.dependsOn 'vagrantCheckVersion'
- vagrantSetUpTask.dependsOn copyPackagingArchives, createVersionFile, createUpgradeFromFile
+ vagrantSetUpTask.dependsOn copyPackagingArchives, createVersionFile, createUpgradeFromFile, createUpgradeIsOssFile
vagrantSetUpTask.dependsOn copyBatsTests, copyBatsUtils
}
diff --git a/buildSrc/src/main/resources/checkstyle_suppressions.xml b/buildSrc/src/main/resources/checkstyle_suppressions.xml
index 2aa72f0fa7a1c..609a7cf2ea66f 100644
--- a/buildSrc/src/main/resources/checkstyle_suppressions.xml
+++ b/buildSrc/src/main/resources/checkstyle_suppressions.xml
@@ -535,7 +535,6 @@
-
diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/Request.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/Request.java
index 4e6fcdbb8dd4a..d68d3b309af51 100644
--- a/client/rest-high-level/src/main/java/org/elasticsearch/client/Request.java
+++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/Request.java
@@ -48,6 +48,7 @@
import org.elasticsearch.action.admin.indices.shrink.ResizeType;
import org.elasticsearch.action.bulk.BulkRequest;
import org.elasticsearch.action.delete.DeleteRequest;
+import org.elasticsearch.action.fieldcaps.FieldCapabilitiesRequest;
import org.elasticsearch.action.get.GetRequest;
import org.elasticsearch.action.get.MultiGetRequest;
import org.elasticsearch.action.index.IndexRequest;
@@ -75,6 +76,7 @@
import org.elasticsearch.common.xcontent.XContentType;
import org.elasticsearch.index.VersionType;
import org.elasticsearch.index.rankeval.RankEvalRequest;
+import org.elasticsearch.rest.action.RestFieldCapabilitiesAction;
import org.elasticsearch.rest.action.search.RestSearchAction;
import org.elasticsearch.search.fetch.subphase.FetchSourceContext;
@@ -536,6 +538,16 @@ static Request existsAlias(GetAliasesRequest getAliasesRequest) {
return new Request(HttpHead.METHOD_NAME, endpoint, params.getParams(), null);
}
+ static Request fieldCaps(FieldCapabilitiesRequest fieldCapabilitiesRequest) {
+ Params params = Params.builder();
+ params.withFields(fieldCapabilitiesRequest.fields());
+ params.withIndicesOptions(fieldCapabilitiesRequest.indicesOptions());
+
+ String[] indices = fieldCapabilitiesRequest.indices();
+ String endpoint = endpoint(indices, "_field_caps");
+ return new Request(HttpGet.METHOD_NAME, endpoint, params.getParams(), null);
+ }
+
static Request rankEval(RankEvalRequest rankEvalRequest) throws IOException {
String endpoint = endpoint(rankEvalRequest.indices(), Strings.EMPTY_ARRAY, "_rank_eval");
Params params = Params.builder();
@@ -572,7 +584,6 @@ private static Request resize(ResizeRequest resizeRequest) throws IOException {
static Request clusterPutSettings(ClusterUpdateSettingsRequest clusterUpdateSettingsRequest) throws IOException {
Params parameters = Params.builder();
- parameters.withFlatSettings(clusterUpdateSettingsRequest.flatSettings());
parameters.withTimeout(clusterUpdateSettingsRequest.timeout());
parameters.withMasterTimeout(clusterUpdateSettingsRequest.masterNodeTimeout());
HttpEntity entity = createEntity(clusterUpdateSettingsRequest, REQUEST_BODY_CONTENT_TYPE);
@@ -603,7 +614,6 @@ static Request indicesExist(GetIndexRequest request) {
params.withLocal(request.local());
params.withHuman(request.humanReadable());
params.withIndicesOptions(request.indicesOptions());
- params.withFlatSettings(request.flatSettings());
params.withIncludeDefaults(request.includeDefaults());
return new Request(HttpHead.METHOD_NAME, endpoint, params.getParams(), null);
}
@@ -613,7 +623,6 @@ static Request indexPutSettings(UpdateSettingsRequest updateSettingsRequest) thr
parameters.withTimeout(updateSettingsRequest.timeout());
parameters.withMasterTimeout(updateSettingsRequest.masterNodeTimeout());
parameters.withIndicesOptions(updateSettingsRequest.indicesOptions());
- parameters.withFlatSettings(updateSettingsRequest.flatSettings());
parameters.withPreserveExisting(updateSettingsRequest.isPreserveExisting());
String[] indices = updateSettingsRequest.indices() == null ? Strings.EMPTY_ARRAY : updateSettingsRequest.indices();
@@ -715,6 +724,13 @@ Params withFetchSourceContext(FetchSourceContext fetchSourceContext) {
return this;
}
+ Params withFields(String[] fields) {
+ if (fields != null && fields.length > 0) {
+ return putParam("fields", String.join(",", fields));
+ }
+ return this;
+ }
+
Params withMasterTimeout(TimeValue masterTimeout) {
return putParam("master_timeout", masterTimeout);
}
diff --git a/client/rest-high-level/src/main/java/org/elasticsearch/client/RestHighLevelClient.java b/client/rest-high-level/src/main/java/org/elasticsearch/client/RestHighLevelClient.java
index bf80aa7720741..c6d5e947f2c62 100644
--- a/client/rest-high-level/src/main/java/org/elasticsearch/client/RestHighLevelClient.java
+++ b/client/rest-high-level/src/main/java/org/elasticsearch/client/RestHighLevelClient.java
@@ -30,6 +30,8 @@
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.action.delete.DeleteRequest;
import org.elasticsearch.action.delete.DeleteResponse;
+import org.elasticsearch.action.fieldcaps.FieldCapabilitiesRequest;
+import org.elasticsearch.action.fieldcaps.FieldCapabilitiesResponse;
import org.elasticsearch.action.get.GetRequest;
import org.elasticsearch.action.get.GetResponse;
import org.elasticsearch.action.get.MultiGetRequest;
@@ -501,6 +503,31 @@ public final void rankEvalAsync(RankEvalRequest rankEvalRequest, ActionListener<RankEvalResponse> listener,
headers);
}
+ /**
+ * Executes a request using the Field Capabilities API.
+ *
+ * See <a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/search-field-caps.html">Field Capabilities API
+ * on elastic.co</a>.
+ */
+ public final FieldCapabilitiesResponse fieldCaps(FieldCapabilitiesRequest fieldCapabilitiesRequest,
+ Header... headers) throws IOException {
+ return performRequestAndParseEntity(fieldCapabilitiesRequest, Request::fieldCaps,
+ FieldCapabilitiesResponse::fromXContent, emptySet(), headers);
+ }
+
+ /**
+ * Asynchronously executes a request using the Field Capabilities API.
+ *
+ * See <a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/search-field-caps.html">Field Capabilities API
+ * on elastic.co</a>.
+ */
+ public final void fieldCapsAsync(FieldCapabilitiesRequest fieldCapabilitiesRequest,
+ ActionListener<FieldCapabilitiesResponse> listener,
+ Header... headers) {
+ performRequestAsyncAndParseEntity(fieldCapabilitiesRequest, Request::fieldCaps,
+ FieldCapabilitiesResponse::fromXContent, listener, emptySet(), headers);
+ }
+
protected final <Req extends ActionRequest, Resp> Resp performRequestAndParseEntity(Req request,
CheckedFunction<Req, Request, IOException> requestConverter,
CheckedFunction<XContentParser, Resp, IOException> entityParser,
diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/PingAndInfoIT.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/PingAndInfoIT.java
index c0de571226c4c..b4d8828eb7e6f 100644
--- a/client/rest-high-level/src/test/java/org/elasticsearch/client/PingAndInfoIT.java
+++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/PingAndInfoIT.java
@@ -42,6 +42,8 @@ public void testInfo() throws IOException {
// only check node name existence, might be a different one from what was hit by low level client in multi-node cluster
assertNotNull(info.getNodeName());
Map<String, Object> versionMap = (Map<String, Object>) infoAsMap.get("version");
+ assertEquals(versionMap.get("build_flavor"), info.getBuild().flavor().displayName());
+ assertEquals(versionMap.get("build_type"), info.getBuild().type().displayName());
assertEquals(versionMap.get("build_hash"), info.getBuild().shortHash());
assertEquals(versionMap.get("build_date"), info.getBuild().date());
assertEquals(versionMap.get("build_snapshot"), info.getBuild().isSnapshot());
diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/RequestTests.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/RequestTests.java
index abce180546dfc..0fdeb7555a04a 100644
--- a/client/rest-high-level/src/test/java/org/elasticsearch/client/RequestTests.java
+++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/RequestTests.java
@@ -52,6 +52,7 @@
import org.elasticsearch.action.bulk.BulkRequest;
import org.elasticsearch.action.bulk.BulkShardRequest;
import org.elasticsearch.action.delete.DeleteRequest;
+import org.elasticsearch.action.fieldcaps.FieldCapabilitiesRequest;
import org.elasticsearch.action.get.GetRequest;
import org.elasticsearch.action.get.MultiGetRequest;
import org.elasticsearch.action.index.IndexRequest;
@@ -89,6 +90,7 @@
import org.elasticsearch.index.rankeval.RankEvalSpec;
import org.elasticsearch.index.rankeval.RatedRequest;
import org.elasticsearch.index.rankeval.RestRankEvalAction;
+import org.elasticsearch.rest.action.RestFieldCapabilitiesAction;
import org.elasticsearch.rest.action.search.RestSearchAction;
import org.elasticsearch.search.Scroll;
import org.elasticsearch.search.aggregations.bucket.terms.TermsAggregationBuilder;
@@ -108,11 +110,14 @@
import java.lang.reflect.Constructor;
import java.lang.reflect.Modifier;
import java.util.ArrayList;
+import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
+import java.util.HashSet;
import java.util.List;
import java.util.Locale;
import java.util.Map;
+import java.util.Set;
import java.util.StringJoiner;
import java.util.function.Consumer;
import java.util.function.Function;
@@ -128,6 +133,8 @@
import static org.elasticsearch.search.RandomSearchRequestGenerator.randomSearchRequest;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertToXContentEquivalent;
import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.Matchers.hasEntry;
+import static org.hamcrest.Matchers.hasKey;
import static org.hamcrest.Matchers.nullValue;
public class RequestTests extends ESTestCase {
@@ -272,7 +279,6 @@ public void testIndicesExist() {
Map<String, String> expectedParams = new HashMap<>();
setRandomIndicesOptions(getIndexRequest::indicesOptions, getIndexRequest::indicesOptions, expectedParams);
setRandomLocal(getIndexRequest, expectedParams);
- setRandomFlatSettings(getIndexRequest::flatSettings, expectedParams);
setRandomHumanReadable(getIndexRequest, expectedParams);
setRandomIncludeDefaults(getIndexRequest, expectedParams);
@@ -1214,6 +1220,47 @@ public void testExistsAliasNoAliasNoIndex() {
}
}
+ public void testFieldCaps() {
+ // Create a random request.
+ String[] indices = randomIndicesNames(0, 5);
+ String[] fields = generateRandomStringArray(5, 10, false, false);
+
+ FieldCapabilitiesRequest fieldCapabilitiesRequest = new FieldCapabilitiesRequest()
+ .indices(indices)
+ .fields(fields);
+
+ Map<String, String> indicesOptionsParams = new HashMap<>();
+ setRandomIndicesOptions(fieldCapabilitiesRequest::indicesOptions,
+ fieldCapabilitiesRequest::indicesOptions,
+ indicesOptionsParams);
+
+ Request request = Request.fieldCaps(fieldCapabilitiesRequest);
+
+ // Verify that the resulting REST request looks as expected.
+ StringJoiner endpoint = new StringJoiner("/", "/", "");
+ String joinedIndices = String.join(",", indices);
+ if (!joinedIndices.isEmpty()) {
+ endpoint.add(joinedIndices);
+ }
+ endpoint.add("_field_caps");
+
+ assertEquals(endpoint.toString(), request.getEndpoint());
+ assertEquals(4, request.getParameters().size());
+
+ // Note that we don't check the field param value explicitly, as field names are passed through
+ // a hash set before being added to the request, and can appear in a non-deterministic order.
+ assertThat(request.getParameters(), hasKey("fields"));
+ String[] requestFields = Strings.splitStringByCommaToArray(request.getParameters().get("fields"));
+ assertEquals(new HashSet<>(Arrays.asList(fields)),
+ new HashSet<>(Arrays.asList(requestFields)));
+
+ for (Map.Entry<String, String> param : indicesOptionsParams.entrySet()) {
+ assertThat(request.getParameters(), hasEntry(param.getKey(), param.getValue()));
+ }
+
+ assertNull(request.getEntity());
+ }
+
public void testRankEval() throws Exception {
RankEvalSpec spec = new RankEvalSpec(
Collections.singletonList(new RatedRequest("queryId", Collections.emptyList(), new SearchSourceBuilder())),
@@ -1234,7 +1281,6 @@ public void testRankEval() throws Exception {
assertEquals(3, request.getParameters().size());
assertEquals(expectedParams, request.getParameters());
assertToXContentBody(spec, request.getEntity());
-
}
public void testSplit() throws IOException {
@@ -1292,7 +1338,6 @@ private static void resizeTest(ResizeType resizeType, CheckedFunction<ResizeRequest, Request, IOException> function)
Map<String, String> expectedParams = new HashMap<>();
- setRandomFlatSettings(request::flatSettings, expectedParams);
setRandomMasterTimeout(request, expectedParams);
setRandomTimeout(request::timeout, AcknowledgedRequest.DEFAULT_ACK_TIMEOUT, expectedParams);
@@ -1344,7 +1389,6 @@ public void testIndexPutSettings() throws IOException {
String[] indices = randomBoolean() ? null : randomIndicesNames(0, 2);
UpdateSettingsRequest updateSettingsRequest = new UpdateSettingsRequest(indices);
Map expectedParams = new HashMap<>();
- setRandomFlatSettings(updateSettingsRequest::flatSettings, expectedParams);
setRandomMasterTimeout(updateSettingsRequest, expectedParams);
setRandomTimeout(updateSettingsRequest::timeout, AcknowledgedRequest.DEFAULT_ACK_TIMEOUT, expectedParams);
setRandomIndicesOptions(updateSettingsRequest::indicesOptions, updateSettingsRequest::indicesOptions, expectedParams);
@@ -1627,16 +1671,6 @@ private static void setRandomTimeout(Consumer<String> setter, TimeValue defaultTimeout, Map<String, String> expectedParams) {
}
}
- private static void setRandomFlatSettings(Consumer<Boolean> setter, Map<String, String> expectedParams) {
- if (randomBoolean()) {
- boolean flatSettings = randomBoolean();
- setter.accept(flatSettings);
- if (flatSettings) {
- expectedParams.put("flat_settings", String.valueOf(flatSettings));
- }
- }
- }
-
private static void setRandomMasterTimeout(MasterNodeRequest<?> request, Map<String, String> expectedParams) {
if (randomBoolean()) {
String masterTimeout = randomTimeValue();
diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/SearchIT.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/SearchIT.java
index 01ef0598100fb..9828041332b32 100644
--- a/client/rest-high-level/src/test/java/org/elasticsearch/client/SearchIT.java
+++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/SearchIT.java
@@ -27,6 +27,9 @@
import org.apache.http.nio.entity.NStringEntity;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.ElasticsearchStatusException;
+import org.elasticsearch.action.fieldcaps.FieldCapabilities;
+import org.elasticsearch.action.fieldcaps.FieldCapabilitiesRequest;
+import org.elasticsearch.action.fieldcaps.FieldCapabilitiesResponse;
import org.elasticsearch.action.search.ClearScrollRequest;
import org.elasticsearch.action.search.ClearScrollResponse;
import org.elasticsearch.action.search.MultiSearchRequest;
@@ -96,14 +99,31 @@ public void indexDocuments() throws IOException {
client().performRequest(HttpPut.METHOD_NAME, "/index/type/5", Collections.emptyMap(), doc5);
client().performRequest(HttpPost.METHOD_NAME, "/index/_refresh");
- StringEntity doc = new StringEntity("{\"field\":\"value1\"}", ContentType.APPLICATION_JSON);
+
+ StringEntity doc = new StringEntity("{\"field\":\"value1\", \"rating\": 7}", ContentType.APPLICATION_JSON);
client().performRequest(HttpPut.METHOD_NAME, "/index1/doc/1", Collections.emptyMap(), doc);
doc = new StringEntity("{\"field\":\"value2\"}", ContentType.APPLICATION_JSON);
client().performRequest(HttpPut.METHOD_NAME, "/index1/doc/2", Collections.emptyMap(), doc);
- doc = new StringEntity("{\"field\":\"value1\"}", ContentType.APPLICATION_JSON);
+
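+        // index2 maps "rating" as keyword while index1 dynamically maps it as long,
+        // giving the field caps tests below conflicting types across the two indices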
+ StringEntity mappings = new StringEntity(
+ "{" +
+ " \"mappings\": {" +
+ " \"doc\": {" +
+ " \"properties\": {" +
+ " \"rating\": {" +
+ " \"type\": \"keyword\"" +
+ " }" +
+ " }" +
+ " }" +
+ " }" +
+ "}}",
+ ContentType.APPLICATION_JSON);
+ client().performRequest("PUT", "/index2", Collections.emptyMap(), mappings);
+ doc = new StringEntity("{\"field\":\"value1\", \"rating\": \"good\"}", ContentType.APPLICATION_JSON);
client().performRequest(HttpPut.METHOD_NAME, "/index2/doc/3", Collections.emptyMap(), doc);
doc = new StringEntity("{\"field\":\"value2\"}", ContentType.APPLICATION_JSON);
client().performRequest(HttpPut.METHOD_NAME, "/index2/doc/4", Collections.emptyMap(), doc);
+
doc = new StringEntity("{\"field\":\"value1\"}", ContentType.APPLICATION_JSON);
client().performRequest(HttpPut.METHOD_NAME, "/index3/doc/5", Collections.emptyMap(), doc);
doc = new StringEntity("{\"field\":\"value2\"}", ContentType.APPLICATION_JSON);
@@ -713,6 +733,57 @@ public void testMultiSearch_failure() throws Exception {
assertThat(multiSearchResponse.getResponses()[1].getResponse(), nullValue());
}
+ public void testFieldCaps() throws IOException {
+ FieldCapabilitiesRequest request = new FieldCapabilitiesRequest()
+ .indices("index1", "index2")
+ .fields("rating", "field");
+
+ FieldCapabilitiesResponse response = execute(request,
+ highLevelClient()::fieldCaps, highLevelClient()::fieldCapsAsync);
+
+ // Check the capabilities for the 'rating' field.
+ assertTrue(response.get().containsKey("rating"));
+ Map ratingResponse = response.getField("rating");
+ assertEquals(2, ratingResponse.size());
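+ // "rating" is a keyword in index2 but was dynamically mapped as a long in index1, hence two entries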
+
+ FieldCapabilities expectedKeywordCapabilities = new FieldCapabilities(
+ "rating", "keyword", true, true, new String[]{"index2"}, null, null);
+ assertEquals(expectedKeywordCapabilities, ratingResponse.get("keyword"));
+
+ FieldCapabilities expectedLongCapabilities = new FieldCapabilities(
+ "rating", "long", true, true, new String[]{"index1"}, null, null);
+ assertEquals(expectedLongCapabilities, ratingResponse.get("long"));
+
+ // Check the capabilities for the 'field' field.
+ assertTrue(response.get().containsKey("field"));
+ Map fieldResponse = response.getField("field");
+ assertEquals(1, fieldResponse.size());
+
+ FieldCapabilities expectedTextCapabilities = new FieldCapabilities(
+ "field", "text", true, false);
+ assertEquals(expectedTextCapabilities, fieldResponse.get("text"));
+ }
+
+ public void testFieldCapsWithNonExistentFields() throws IOException {
+ FieldCapabilitiesRequest request = new FieldCapabilitiesRequest()
+ .indices("index2")
+ .fields("nonexistent");
+
+ FieldCapabilitiesResponse response = execute(request,
+ highLevelClient()::fieldCaps, highLevelClient()::fieldCapsAsync);
+ assertTrue(response.get().isEmpty());
+ }
+
+ public void testFieldCapsWithNonExistentIndices() {
+ FieldCapabilitiesRequest request = new FieldCapabilitiesRequest()
+ .indices("non-existent")
+ .fields("rating");
+
+ ElasticsearchException exception = expectThrows(ElasticsearchException.class,
+ () -> execute(request, highLevelClient()::fieldCaps, highLevelClient()::fieldCapsAsync));
+ assertEquals(RestStatus.NOT_FOUND, exception.status());
+ }
+
private static void assertSearchHeader(SearchResponse searchResponse) {
assertThat(searchResponse.getTook().nanos(), greaterThanOrEqualTo(0L));
assertEquals(0, searchResponse.getFailedShards());
diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/documentation/ClusterClientDocumentationIT.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/documentation/ClusterClientDocumentationIT.java
index 0747ca757c4b9..2e7ea1650f424 100644
--- a/client/rest-high-level/src/test/java/org/elasticsearch/client/documentation/ClusterClientDocumentationIT.java
+++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/documentation/ClusterClientDocumentationIT.java
@@ -124,10 +124,6 @@ public void testClusterPutSettings() throws IOException {
request.masterNodeTimeout("1m"); // <2>
// end::put-settings-request-masterTimeout
- // tag::put-settings-request-flat-settings
- request.flatSettings(true); // <1>
- // end::put-settings-request-flat-settings
-
// tag::put-settings-execute
ClusterUpdateSettingsResponse response = client.cluster().putSettings(request);
// end::put-settings-execute
diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/documentation/IndicesClientDocumentationIT.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/documentation/IndicesClientDocumentationIT.java
index e33d1e4729b0e..24c321f87f998 100644
--- a/client/rest-high-level/src/test/java/org/elasticsearch/client/documentation/IndicesClientDocumentationIT.java
+++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/documentation/IndicesClientDocumentationIT.java
@@ -58,7 +58,6 @@
import org.elasticsearch.action.support.IndicesOptions;
import org.elasticsearch.client.ESRestHighLevelClientTestCase;
import org.elasticsearch.client.RestHighLevelClient;
-import org.elasticsearch.cluster.metadata.IndexMetaData;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.unit.ByteSizeUnit;
import org.elasticsearch.common.unit.ByteSizeValue;
@@ -114,8 +113,7 @@ public void testIndicesExist() throws IOException {
request.local(false); // <1>
request.humanReadable(true); // <2>
request.includeDefaults(false); // <3>
- request.flatSettings(false); // <4>
- request.indicesOptions(indicesOptions); // <5>
+ request.indicesOptions(indicesOptions); // <4>
// end::indices-exists-request-optionals
// tag::indices-exists-response
@@ -1433,9 +1431,6 @@ public void testIndexPutSettings() throws Exception {
// end::put-settings-settings-source
}
- // tag::put-settings-request-flat-settings
- request.flatSettings(true); // <1>
- // end::put-settings-request-flat-settings
// tag::put-settings-request-preserveExisting
request.setPreserveExisting(false); // <1>
// end::put-settings-request-preserveExisting
diff --git a/client/rest-high-level/src/test/java/org/elasticsearch/client/documentation/SearchDocumentationIT.java b/client/rest-high-level/src/test/java/org/elasticsearch/client/documentation/SearchDocumentationIT.java
index bd1cf48f14195..4400d05a9f820 100644
--- a/client/rest-high-level/src/test/java/org/elasticsearch/client/documentation/SearchDocumentationIT.java
+++ b/client/rest-high-level/src/test/java/org/elasticsearch/client/documentation/SearchDocumentationIT.java
@@ -21,8 +21,13 @@
import org.elasticsearch.action.ActionListener;
import org.elasticsearch.action.LatchedActionListener;
+import org.elasticsearch.action.admin.indices.create.CreateIndexRequest;
+import org.elasticsearch.action.admin.indices.create.CreateIndexResponse;
import org.elasticsearch.action.bulk.BulkRequest;
import org.elasticsearch.action.bulk.BulkResponse;
+import org.elasticsearch.action.fieldcaps.FieldCapabilities;
+import org.elasticsearch.action.fieldcaps.FieldCapabilitiesRequest;
+import org.elasticsearch.action.fieldcaps.FieldCapabilitiesResponse;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.action.index.IndexResponse;
import org.elasticsearch.action.search.ClearScrollRequest;
@@ -44,6 +49,16 @@
import org.elasticsearch.index.query.MatchQueryBuilder;
import org.elasticsearch.index.query.QueryBuilder;
import org.elasticsearch.index.query.QueryBuilders;
+import org.elasticsearch.index.rankeval.EvalQueryQuality;
+import org.elasticsearch.index.rankeval.EvaluationMetric;
+import org.elasticsearch.index.rankeval.MetricDetail;
+import org.elasticsearch.index.rankeval.PrecisionAtK;
+import org.elasticsearch.index.rankeval.RankEvalRequest;
+import org.elasticsearch.index.rankeval.RankEvalResponse;
+import org.elasticsearch.index.rankeval.RankEvalSpec;
+import org.elasticsearch.index.rankeval.RatedDocument;
+import org.elasticsearch.index.rankeval.RatedRequest;
+import org.elasticsearch.index.rankeval.RatedSearchHit;
import org.elasticsearch.rest.RestStatus;
import org.elasticsearch.search.Scroll;
import org.elasticsearch.search.SearchHit;
@@ -74,6 +89,7 @@
import org.elasticsearch.search.suggest.term.TermSuggestion;
import java.io.IOException;
+import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
@@ -82,6 +98,8 @@
import java.util.concurrent.TimeUnit;
import static org.elasticsearch.index.query.QueryBuilders.matchQuery;
+import static org.hamcrest.Matchers.contains;
+import static org.hamcrest.Matchers.containsInAnyOrder;
import static org.hamcrest.Matchers.containsString;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.greaterThan;
@@ -146,6 +164,7 @@ public void testSearch() throws Exception {
// tag::search-source-setter
SearchRequest searchRequest = new SearchRequest();
+ searchRequest.indices("posts");
searchRequest.source(sourceBuilder);
// end::search-source-setter
@@ -688,6 +707,136 @@ public void onFailure(Exception e) {
}
}
+ public void testFieldCaps() throws Exception {
+ indexSearchTestData();
+ RestHighLevelClient client = highLevelClient();
+ // tag::field-caps-request
+ FieldCapabilitiesRequest request = new FieldCapabilitiesRequest()
+ .fields("user")
+ .indices("posts", "authors", "contributors");
+ // end::field-caps-request
+
+ // tag::field-caps-request-indicesOptions
+ request.indicesOptions(IndicesOptions.lenientExpandOpen()); // <1>
+ // end::field-caps-request-indicesOptions
+
+ // tag::field-caps-execute
+ FieldCapabilitiesResponse response = client.fieldCaps(request);
+ // end::field-caps-execute
+
+ // tag::field-caps-response
+ assertThat(response.get().keySet(), contains("user"));
+ Map userResponse = response.getField("user");
+
+ assertThat(userResponse.keySet(), containsInAnyOrder("keyword", "text")); // <1>
+ FieldCapabilities textCapabilities = userResponse.get("keyword");
+
+ assertTrue(textCapabilities.isSearchable());
+ assertFalse(textCapabilities.isAggregatable());
+
+ assertArrayEquals(textCapabilities.indices(), // <2>
+ new String[]{"authors", "contributors"});
+ assertNull(textCapabilities.nonSearchableIndices()); // <3>
+ assertArrayEquals(textCapabilities.nonAggregatableIndices(), // <4>
+ new String[]{"authors"});
+ // end::field-caps-response
+
+ // tag::field-caps-execute-listener
+ ActionListener<FieldCapabilitiesResponse> listener = new ActionListener<FieldCapabilitiesResponse>() {
+ @Override
+ public void onResponse(FieldCapabilitiesResponse response) {
+ // <1>
+ }
+
+ @Override
+ public void onFailure(Exception e) {
+ // <2>
+ }
+ };
+ // end::field-caps-execute-listener
+
+ // Replace the empty listener with a blocking listener for tests.
+ CountDownLatch latch = new CountDownLatch(1);
+ listener = new LatchedActionListener<>(listener, latch);
+
+ // tag::field-caps-execute-async
+ client.fieldCapsAsync(request, listener); // <1>
+ // end::field-caps-execute-async
+
+ assertTrue(latch.await(30L, TimeUnit.SECONDS));
+ }
+
+ public void testRankEval() throws Exception {
+ indexSearchTestData();
+ RestHighLevelClient client = highLevelClient();
+ {
+ // tag::rank-eval-request-basic
+ EvaluationMetric metric = new PrecisionAtK(); // <1>
+ List<RatedDocument> ratedDocs = new ArrayList<>();
+ ratedDocs.add(new RatedDocument("posts", "1", 1)); // <2>
+ SearchSourceBuilder searchQuery = new SearchSourceBuilder();
+ searchQuery.query(QueryBuilders.matchQuery("user", "kimchy"));// <3>
+ RatedRequest ratedRequest = // <4>
+ new RatedRequest("kimchy_query", ratedDocs, searchQuery);
+ List<RatedRequest> ratedRequests = Arrays.asList(ratedRequest);
+ RankEvalSpec specification =
+ new RankEvalSpec(ratedRequests, metric); // <5>
+ RankEvalRequest request = // <6>
+ new RankEvalRequest(specification, new String[] { "posts" });
+ // end::rank-eval-request-basic
+
+ // tag::rank-eval-execute
+ RankEvalResponse response = client.rankEval(request);
+ // end::rank-eval-execute
+
+ // tag::rank-eval-response
+ double evaluationResult = response.getEvaluationResult(); // <1>
+ assertEquals(1.0 / 3.0, evaluationResult, 0.0);
+ Map<String, EvalQueryQuality> partialResults =
+ response.getPartialResults();
+ EvalQueryQuality evalQuality =
+ partialResults.get("kimchy_query"); // <2>
+ assertEquals("kimchy_query", evalQuality.getId());
+ double qualityLevel = evalQuality.getQualityLevel(); // <3>
+ assertEquals(1.0 / 3.0, qualityLevel, 0.0);
+ List<RatedSearchHit> hitsAndRatings = evalQuality.getHitsAndRatings();
+ RatedSearchHit ratedSearchHit = hitsAndRatings.get(0);
+ assertEquals("3", ratedSearchHit.getSearchHit().getId()); // <4>
+ assertFalse(ratedSearchHit.getRating().isPresent()); // <5>
+ MetricDetail metricDetails = evalQuality.getMetricDetails();
+ String metricName = metricDetails.getMetricName();
+ assertEquals(PrecisionAtK.NAME, metricName); // <6>
+ PrecisionAtK.Detail detail = (PrecisionAtK.Detail) metricDetails;
+ assertEquals(1, detail.getRelevantRetrieved()); // <7>
+ assertEquals(3, detail.getRetrieved());
+ // end::rank-eval-response
+
+ // tag::rank-eval-execute-listener
+ ActionListener<RankEvalResponse> listener = new ActionListener<RankEvalResponse>() {
+ @Override
+ public void onResponse(RankEvalResponse response) {
+ // <1>
+ }
+
+ @Override
+ public void onFailure(Exception e) {
+ // <2>
+ }
+ };
+ // end::rank-eval-execute-listener
+
+ // Replace the empty listener with a blocking listener for tests
+ final CountDownLatch latch = new CountDownLatch(1);
+ listener = new LatchedActionListener<>(listener, latch);
+
+ // tag::rank-eval-execute-async
+ client.rankEvalAsync(request, listener); // <1>
+ // end::rank-eval-execute-async
+
+ assertTrue(latch.await(30L, TimeUnit.SECONDS));
+ }
+ }
+
public void testMultiSearch() throws Exception {
indexSearchTestData();
RestHighLevelClient client = highLevelClient();
@@ -712,7 +861,7 @@ public void testMultiSearch() throws Exception {
MultiSearchResponse.Item firstResponse = response.getResponses()[0]; // <1>
assertNull(firstResponse.getFailure()); // <2>
SearchResponse searchResponse = firstResponse.getResponse(); // <3>
- assertEquals(3, searchResponse.getHits().getTotalHits());
+ assertEquals(4, searchResponse.getHits().getTotalHits());
MultiSearchResponse.Item secondResponse = response.getResponses()[1]; // <4>
assertNull(secondResponse.getFailure());
searchResponse = secondResponse.getResponse();
@@ -758,18 +907,35 @@ public void onFailure(Exception e) {
}
private void indexSearchTestData() throws IOException {
- BulkRequest request = new BulkRequest();
- request.add(new IndexRequest("posts", "doc", "1")
+ CreateIndexRequest authorsRequest = new CreateIndexRequest("authors")
+ .mapping("doc", "user", "type=keyword,doc_values=false");
+ CreateIndexResponse authorsResponse = highLevelClient().indices().create(authorsRequest);
+ assertTrue(authorsResponse.isAcknowledged());
+
+ CreateIndexRequest reviewersRequest = new CreateIndexRequest("contributors")
+ .mapping("doc", "user", "type=keyword");
+ CreateIndexResponse reviewersResponse = highLevelClient().indices().create(reviewersRequest);
+ assertTrue(reviewersResponse.isAcknowledged());
+
+ BulkRequest bulkRequest = new BulkRequest();
+ bulkRequest.add(new IndexRequest("posts", "doc", "1")
.source(XContentType.JSON, "title", "In which order are my Elasticsearch queries executed?", "user",
Arrays.asList("kimchy", "luca"), "innerObject", Collections.singletonMap("key", "value")));
- request.add(new IndexRequest("posts", "doc", "2")
+ bulkRequest.add(new IndexRequest("posts", "doc", "2")
.source(XContentType.JSON, "title", "Current status and upcoming changes in Elasticsearch", "user",
Arrays.asList("kimchy", "christoph"), "innerObject", Collections.singletonMap("key", "value")));
- request.add(new IndexRequest("posts", "doc", "3")
+ bulkRequest.add(new IndexRequest("posts", "doc", "3")
.source(XContentType.JSON, "title", "The Future of Federated Search in Elasticsearch", "user",
Arrays.asList("kimchy", "tanguy"), "innerObject", Collections.singletonMap("key", "value")));
- request.setRefreshPolicy(WriteRequest.RefreshPolicy.IMMEDIATE);
- BulkResponse bulkResponse = highLevelClient().bulk(request);
+
+ bulkRequest.add(new IndexRequest("authors", "doc", "1")
+ .source(XContentType.JSON, "user", "kimchy"));
+ bulkRequest.add(new IndexRequest("contributors", "doc", "1")
+ .source(XContentType.JSON, "user", "tanguy"));
+
+
+ bulkRequest.setRefreshPolicy(WriteRequest.RefreshPolicy.IMMEDIATE);
+ BulkResponse bulkResponse = highLevelClient().bulk(bulkRequest);
assertSame(RestStatus.OK, bulkResponse.status());
assertFalse(bulkResponse.hasFailures());
}
diff --git a/client/rest/build.gradle b/client/rest/build.gradle
index 8e0f179634a27..bcb928495c5d2 100644
--- a/client/rest/build.gradle
+++ b/client/rest/build.gradle
@@ -20,7 +20,6 @@
import org.elasticsearch.gradle.precommit.PrecommitTasks
apply plugin: 'elasticsearch.build'
-apply plugin: 'ru.vyarus.animalsniffer'
apply plugin: 'nebula.maven-base-publish'
apply plugin: 'nebula.maven-scm'
@@ -52,8 +51,6 @@ dependencies {
testCompile "org.hamcrest:hamcrest-all:${versions.hamcrest}"
testCompile "org.elasticsearch:securemock:${versions.securemock}"
testCompile "org.elasticsearch:mocksocket:${versions.mocksocket}"
- testCompile "org.codehaus.mojo:animal-sniffer-annotations:1.15"
- signature "org.codehaus.mojo.signature:java17:1.0@signature"
}
forbiddenApisMain {
diff --git a/client/rest/src/test/java/org/elasticsearch/client/RestClientBuilderIntegTests.java b/client/rest/src/test/java/org/elasticsearch/client/RestClientBuilderIntegTests.java
index 8142fea6d259b..199b7542e62a2 100644
--- a/client/rest/src/test/java/org/elasticsearch/client/RestClientBuilderIntegTests.java
+++ b/client/rest/src/test/java/org/elasticsearch/client/RestClientBuilderIntegTests.java
@@ -24,7 +24,6 @@
import com.sun.net.httpserver.HttpsConfigurator;
import com.sun.net.httpserver.HttpsServer;
import org.apache.http.HttpHost;
-import org.codehaus.mojo.animal_sniffer.IgnoreJRERequirement;
import org.elasticsearch.mocksocket.MockHttpServer;
import org.junit.AfterClass;
import org.junit.BeforeClass;
@@ -46,8 +45,6 @@
/**
* Integration test to validate the builder builds a client with the correct configuration
*/
-//animal-sniffer doesn't like our usage of com.sun.net.httpserver.* classes
-@IgnoreJRERequirement
public class RestClientBuilderIntegTests extends RestClientTestCase {
private static HttpsServer httpsServer;
@@ -60,8 +57,6 @@ public static void startHttpServer() throws Exception {
httpsServer.start();
}
- //animal-sniffer doesn't like our usage of com.sun.net.httpserver.* classes
- @IgnoreJRERequirement
private static class ResponseHandler implements HttpHandler {
@Override
public void handle(HttpExchange httpExchange) throws IOException {
diff --git a/client/rest/src/test/java/org/elasticsearch/client/RestClientMultipleHostsIntegTests.java b/client/rest/src/test/java/org/elasticsearch/client/RestClientMultipleHostsIntegTests.java
index da5a960c2e84c..16c192b3977a8 100644
--- a/client/rest/src/test/java/org/elasticsearch/client/RestClientMultipleHostsIntegTests.java
+++ b/client/rest/src/test/java/org/elasticsearch/client/RestClientMultipleHostsIntegTests.java
@@ -23,7 +23,6 @@
import com.sun.net.httpserver.HttpHandler;
import com.sun.net.httpserver.HttpServer;
import org.apache.http.HttpHost;
-import org.codehaus.mojo.animal_sniffer.IgnoreJRERequirement;
import org.elasticsearch.mocksocket.MockHttpServer;
import org.junit.AfterClass;
import org.junit.Before;
@@ -48,8 +47,6 @@
* Integration test to check interaction between {@link RestClient} and {@link org.apache.http.client.HttpClient}.
* Works against real http servers, multiple hosts. Also tests failover by randomly shutting down hosts.
*/
-//animal-sniffer doesn't like our usage of com.sun.net.httpserver.* classes
-@IgnoreJRERequirement
public class RestClientMultipleHostsIntegTests extends RestClientTestCase {
private static HttpServer[] httpServers;
@@ -90,8 +87,6 @@ private static HttpServer createHttpServer() throws Exception {
return httpServer;
}
- //animal-sniffer doesn't like our usage of com.sun.net.httpserver.* classes
- @IgnoreJRERequirement
private static class ResponseHandler implements HttpHandler {
private final int statusCode;
diff --git a/client/rest/src/test/java/org/elasticsearch/client/RestClientSingleHostIntegTests.java b/client/rest/src/test/java/org/elasticsearch/client/RestClientSingleHostIntegTests.java
index 59aa2baab9672..dd23dbe454fa4 100644
--- a/client/rest/src/test/java/org/elasticsearch/client/RestClientSingleHostIntegTests.java
+++ b/client/rest/src/test/java/org/elasticsearch/client/RestClientSingleHostIntegTests.java
@@ -33,7 +33,6 @@
import org.apache.http.impl.client.BasicCredentialsProvider;
import org.apache.http.impl.nio.client.HttpAsyncClientBuilder;
import org.apache.http.util.EntityUtils;
-import org.codehaus.mojo.animal_sniffer.IgnoreJRERequirement;
import org.elasticsearch.mocksocket.MockHttpServer;
import org.junit.AfterClass;
import org.junit.BeforeClass;
@@ -64,8 +63,6 @@
* Integration test to check interaction between {@link RestClient} and {@link org.apache.http.client.HttpClient}.
* Works against a real http server, one single host.
*/
-//animal-sniffer doesn't like our usage of com.sun.net.httpserver.* classes
-@IgnoreJRERequirement
public class RestClientSingleHostIntegTests extends RestClientTestCase {
private static HttpServer httpServer;
@@ -91,8 +88,6 @@ private static HttpServer createHttpServer() throws Exception {
return httpServer;
}
- //animal-sniffer doesn't like our usage of com.sun.net.httpserver.* classes
- @IgnoreJRERequirement
private static class ResponseHandler implements HttpHandler {
private final int statusCode;
diff --git a/distribution/archives/build.gradle b/distribution/archives/build.gradle
index bb59bc84f5385..f2fc297a9e4c8 100644
--- a/distribution/archives/build.gradle
+++ b/distribution/archives/build.gradle
@@ -23,8 +23,12 @@ import org.elasticsearch.gradle.BuildPlugin
import org.elasticsearch.gradle.EmptyDirTask
import org.elasticsearch.gradle.LoggedExec
import org.elasticsearch.gradle.MavenFilteringHack
+import org.elasticsearch.gradle.VersionProperties
import org.elasticsearch.gradle.plugin.PluginBuildPlugin
+import java.nio.file.Files
+import java.nio.file.Path
+
// need this so Zip/Tar tasks get basic defaults...
apply plugin: 'base'
@@ -42,23 +46,23 @@ task createPluginsDir(type: EmptyDirTask) {
dirMode 0755
}
-CopySpec archiveFiles(CopySpec... innerFiles) {
+CopySpec archiveFiles(CopySpec modulesFiles, String distributionType, boolean oss) {
return copySpec {
into("elasticsearch-${version}") {
with libFiles
into('config') {
dirMode 0750
fileMode 0660
- with configFiles('def')
+ with configFiles(distributionType, oss)
}
into('bin') {
+ with binFiles(distributionType, oss)
with copySpec {
- with binFiles('def')
from('../src/bin') {
include '*.bat'
filter(FixCrLfFilter, eol: FixCrLfFilter.CrLf.newInstance('crlf'))
}
- MavenFilteringHack.filter(it, expansionsForDistribution('def'))
+ MavenFilteringHack.filter(it, expansionsForDistribution(distributionType, oss))
}
}
into('') {
@@ -73,43 +77,65 @@ CopySpec archiveFiles(CopySpec... innerFiles) {
pluginsDir.getParent()
}
}
- with commonFiles
+ from(rootProject.projectDir) {
+ include 'README.textile'
+ }
+ from(rootProject.file('licenses')) {
+ include oss ? 'APACHE-LICENSE-2.0.txt' : 'ELASTIC-LICENSE.txt'
+ rename { 'LICENSE.txt' }
+ }
+
with noticeFile
from('../src') {
include 'bin/*.exe'
}
- for (CopySpec files : innerFiles) {
- with files
+ into('modules') {
+ with modulesFiles
}
}
}
}
-task buildIntegTestZip(type: Zip) {
+// common config across all zip/tar
+tasks.withType(AbstractArchiveTask) {
dependsOn createLogsDir, createPluginsDir
- destinationDir = file('integ-test-zip/build/distributions')
- baseName = 'elasticsearch'
- with archiveFiles(transportModulesFiles)
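+ // derive the output subdirectory, eg "oss-tar", from the task name, eg "buildOssTar"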
+ String subdir = it.name.substring('build'.size()).replaceAll(/[A-Z]/) { '-' + it.toLowerCase() }.substring(1)
+ destinationDir = file("${subdir}/build/distributions")
+ baseName = "elasticsearch${ subdir.contains('oss') ? '-oss' : ''}"
+}
+
+task buildIntegTestZip(type: Zip) {
+ with archiveFiles(transportModulesFiles, 'zip', false)
}
task buildZip(type: Zip) {
- dependsOn createLogsDir, createPluginsDir
- destinationDir = file('zip/build/distributions')
- baseName = 'elasticsearch'
- with archiveFiles(modulesFiles)
+ with archiveFiles(modulesFiles(false), 'zip', false)
}
-task buildTar(type: Tar) {
- dependsOn createLogsDir, createPluginsDir
- destinationDir = file('tar/build/distributions')
- baseName = 'elasticsearch'
+task buildOssZip(type: Zip) {
+ with archiveFiles(modulesFiles(true), 'zip', true)
+}
+
+Closure commonTarConfig = {
extension = 'tar.gz'
compression = Compression.GZIP
dirMode 0755
fileMode 0644
- with archiveFiles(modulesFiles)
}
+task buildTar(type: Tar) {
+ configure(commonTarConfig)
+ with archiveFiles(modulesFiles(false), 'tar', false)
+}
+
+task buildOssTar(type: Tar) {
+ configure(commonTarConfig)
+ with archiveFiles(modulesFiles(true), 'tar', true)
+}
+
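+// helpers to check whether the needed extraction tools are installed on this machine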
+Closure tarExists = { it -> new File('/bin/tar').exists() || new File('/usr/bin/tar').exists() || new File('/usr/local/bin/tar').exists() }
+Closure unzipExists = { it -> new File('/bin/unzip').exists() || new File('/usr/bin/unzip').exists() || new File('/usr/local/bin/unzip').exists() }
+
// This configures the default artifact for the distribution specific
// subprojects. We have subprojects for two reasons:
// 1. Gradle project substitutions can only bind to the default
@@ -119,35 +145,78 @@ task buildTar(type: Tar) {
subprojects {
apply plugin: 'distribution'
- archivesBaseName = 'elasticsearch'
-
String buildTask = "build${it.name.replaceAll(/-[a-z]/) { it.substring(1).toUpperCase() }.capitalize()}"
ext.buildDist = parent.tasks.getByName(buildTask)
artifacts {
'default' buildDist
}
- // sanity checks if a archives can be extracted
- File extractionDir = new File(buildDir, 'extracted')
- task testExtraction(type: LoggedExec) {
+ // sanity checks if archives can be extracted
+ final File archiveExtractionDir
+ if (project.name.contains('tar')) {
+ archiveExtractionDir = new File(buildDir, 'tar-extracted')
+ } else {
+ assert project.name.contains('zip')
+ archiveExtractionDir = new File(buildDir, 'zip-extracted')
+ }
+ task checkExtraction(type: LoggedExec) {
dependsOn buildDist
doFirst {
- project.delete(extractionDir)
- extractionDir.mkdirs()
+ project.delete(archiveExtractionDir)
+ archiveExtractionDir.mkdirs()
}
}
- if (project.name.contains('zip')) {
- testExtraction {
- onlyIf { new File('/bin/unzip').exists() || new File('/usr/bin/unzip').exists() || new File('/usr/local/bin/unzip').exists() }
- commandLine 'unzip', "${-> buildDist.outputs.files.singleFile}", '-d', extractionDir
+ check.dependsOn checkExtraction
+ if (project.name.contains('tar')) {
+ checkExtraction {
+ onlyIf tarExists
+ commandLine 'tar', '-xvzf', "${-> buildDist.outputs.files.singleFile}", '-C', archiveExtractionDir
}
- } else { // tar
- testExtraction {
- onlyIf { new File('/bin/tar').exists() || new File('/usr/bin/tar').exists() || new File('/usr/local/bin/tar').exists() }
- commandLine 'tar', '-xvzf', "${-> buildDist.outputs.files.singleFile}", '-C', extractionDir
+ } else {
+ assert project.name.contains('zip')
+ checkExtraction {
+ onlyIf unzipExists
+ commandLine 'unzip', "${-> buildDist.outputs.files.singleFile}", '-d', archiveExtractionDir
}
}
- check.dependsOn testExtraction
+
+ final Closure toolExists
+ if (project.name.contains('tar')) {
+ toolExists = tarExists
+ } else {
+ assert project.name.contains('zip')
+ toolExists = unzipExists
+ }
+
+
+ task checkLicense {
+ dependsOn buildDist, checkExtraction
+ onlyIf toolExists
+ doLast {
+ final String licenseFilename
+ if (project.name.contains('oss-')) {
+ licenseFilename = "APACHE-LICENSE-2.0.txt"
+ } else {
+ licenseFilename = "ELASTIC-LICENSE.txt"
+ }
+ final List licenseLines = Files.readAllLines(rootDir.toPath().resolve("licenses/" + licenseFilename))
+ final Path licensePath = archiveExtractionDir.toPath().resolve("elasticsearch-${VersionProperties.elasticsearch}/LICENSE.txt")
+ assertLinesInFile(licensePath, licenseLines)
+ }
+ }
+ check.dependsOn checkLicense
+
+ task checkNotice {
+ dependsOn buildDist, checkExtraction
+ onlyIf toolExists
+ doLast {
+ final List noticeLines = Arrays.asList("Elasticsearch", "Copyright 2009-2018 Elasticsearch")
+ final Path noticePath = archiveExtractionDir.toPath().resolve("elasticsearch-${VersionProperties.elasticsearch}/NOTICE.txt")
+ assertLinesInFile(noticePath, noticeLines)
+ }
+ }
+ check.dependsOn checkNotice
+
}
/*****************************************************************************
@@ -158,7 +227,7 @@ configure(subprojects.findAll { it.name == 'integ-test-zip' }) {
apply plugin: 'elasticsearch.rest-test'
integTest {
- includePackaged true
+ includePackaged = true
}
integTestCluster {
@@ -190,12 +259,16 @@ configure(subprojects.findAll { it.name.contains('zip') }) {
// note: the group must be correct before applying the nexus plugin, or
// it will capture the wrong value...
- project.group = "org.elasticsearch.distribution.${project.name}"
+ String subgroup = project.name == 'integ-test-zip' ? 'integ-test-zip' : 'zip'
+ project.group = "org.elasticsearch.distribution.${subgroup}"
+
+ // make the pom file name use elasticsearch instead of the project name
+ archivesBaseName = "elasticsearch${it.name.contains('oss') ? '-oss' : ''}"
publishing {
publications {
nebula {
- artifactId 'elasticsearch'
+ artifactId archivesBaseName
artifact buildDist
}
/*
@@ -215,7 +288,7 @@ configure(subprojects.findAll { it.name.contains('zip') }) {
* local work, since we publish to maven central externally.
*/
nebulaRealPom(MavenPublication) {
- artifactId 'elasticsearch'
+ artifactId archivesBaseName
pom.packaging = 'pom'
pom.withXml { XmlProvider xml ->
Node root = xml.asNode()
@@ -229,4 +302,3 @@ configure(subprojects.findAll { it.name.contains('zip') }) {
}
}
}
-
diff --git a/distribution/archives/oss-tar/build.gradle b/distribution/archives/oss-tar/build.gradle
new file mode 100644
index 0000000000000..4a6dde5fc0c92
--- /dev/null
+++ b/distribution/archives/oss-tar/build.gradle
@@ -0,0 +1,2 @@
+// This file is intentionally blank. All configuration of the
+// distribution is done in the parent project.
diff --git a/distribution/archives/oss-zip/build.gradle b/distribution/archives/oss-zip/build.gradle
new file mode 100644
index 0000000000000..4a6dde5fc0c92
--- /dev/null
+++ b/distribution/archives/oss-zip/build.gradle
@@ -0,0 +1,2 @@
+// This file is intentionally blank. All configuration of the
+// distribution is done in the parent project.
diff --git a/distribution/build.gradle b/distribution/build.gradle
index 20758deb918c0..c1ab5b76148b3 100644
--- a/distribution/build.gradle
+++ b/distribution/build.gradle
@@ -17,17 +17,13 @@
* under the License.
*/
-
-import org.apache.tools.ant.filters.FixCrLfFilter
-import org.apache.tools.ant.taskdefs.condition.Os
-import org.elasticsearch.gradle.BuildPlugin
import org.elasticsearch.gradle.ConcatFilesTask
import org.elasticsearch.gradle.MavenFilteringHack
import org.elasticsearch.gradle.NoticeTask
-import org.elasticsearch.gradle.precommit.DependencyLicensesTask
-import org.elasticsearch.gradle.precommit.UpdateShasTask
import org.elasticsearch.gradle.test.RunTask
+import java.nio.file.Path
+
Collection distributions = project('archives').subprojects + project('packages').subprojects
/*****************************************************************************
@@ -46,42 +42,156 @@ task generateDependenciesReport(type: ConcatFilesTask) {
*****************************************************************************/
// integ test zip only uses server, so a different notice file is needed there
-task buildCoreNotice(type: NoticeTask) {
+task buildServerNotice(type: NoticeTask) {
licensesDir new File(project(':server').projectDir, 'licenses')
}
// other distributions include notices from modules as well, which are added below later
-task buildFullNotice(type: NoticeTask) {
+task buildDefaultNotice(type: NoticeTask) {
+ licensesDir new File(project(':server').projectDir, 'licenses')
+}
+
+// other distributions include notices from modules as well, which are added below later
+task buildOssNotice(type: NoticeTask) {
licensesDir new File(project(':server').projectDir, 'licenses')
}
/*****************************************************************************
* Modules *
*****************************************************************************/
+String ossOutputs = 'build/outputs/oss'
+String defaultOutputs = 'build/outputs/default'
+String transportOutputs = 'build/outputs/transport-only'
+
+task processOssOutputs(type: Sync) {
+ into ossOutputs
+}
+
+task processDefaultOutputs(type: Sync) {
+ into defaultOutputs
+ from processOssOutputs
+}
+
+// Integ tests work over the rest http layer, so we need a transport included with the integ test zip.
+// All transport modules are included so that they may be randomized for testing
+task processTransportOutputs(type: Sync) {
+ into transportOutputs
+}
+
+// these are dummy tasks that can be used to depend on the relevant sub output dir
+task buildOssModules {
+ dependsOn processOssOutputs
+ outputs.dir "${ossOutputs}/modules"
+}
+task buildOssBin {
+ dependsOn processOssOutputs
+ outputs.dir "${ossOutputs}/bin"
+}
+task buildOssConfig {
+ dependsOn processOssOutputs
+ outputs.dir "${ossOutputs}/config"
+}
+task buildDefaultModules {
+ dependsOn processDefaultOutputs
+ outputs.dir "${defaultOutputs}/modules"
+}
+task buildDefaultBin {
+ dependsOn processDefaultOutputs
+ outputs.dir "${defaultOutputs}/bin"
+}
+task buildDefaultConfig {
+ dependsOn processDefaultOutputs
+ outputs.dir "${defaultOutputs}/config"
+}
+task buildTransportModules {
+ dependsOn processTransportOutputs
+ outputs.dir "${transportOutputs}/modules"
+}
-task buildModules(type: Sync) {
- into 'build/modules'
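+// adds the contents of a module's bundled plugin zip to the given sync task, placing them under modules/<name>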
+void copyModule(Sync copyTask, Project module) {
+ copyTask.configure {
+ dependsOn { module.bundlePlugin }
+ from({ zipTree(module.bundlePlugin.outputs.files.singleFile) }) {
+ includeEmptyDirs false
+
+ // these are handled separately in the log4j config tasks below
+ exclude '*/config/log4j2.properties'
+ exclude 'config/log4j2.properties'
+
+ eachFile { details ->
+ String name = module.plugins.hasPlugin('elasticsearch.esplugin') ? module.esplugin.name : module.es_meta_plugin.name
+ // Copy all non config/bin files
+ // Note these might be under a subdirectory in the case of a meta plugin
+ if ((details.relativePath.pathString ==~ /([^\/]+\/)?(config|bin)\/.*/) == false) {
+ details.relativePath = details.relativePath.prepend('modules', name)
+ } else if ((details.relativePath.pathString ==~ /([^\/]+\/)(config|bin)\/.*/)) {
+ // this is the meta plugin case, in which we need to remove the intermediate dir
+ String[] segments = details.relativePath.segments
+ details.relativePath = new RelativePath(true, segments.takeRight(segments.length - 1))
+ }
+ }
+ }
+ }
+}
+
+// log4j config could be contained in modules, so we must join it together using these tasks
+task buildOssLog4jConfig {
+ dependsOn processOssOutputs
+ ext.contents = []
+ ext.log4jFile = file("${ossOutputs}/log4j2.properties")
+ outputs.file log4jFile
+}
+task buildDefaultLog4jConfig {
+ dependsOn processDefaultOutputs
+ ext.contents = []
+ ext.log4jFile = file("${defaultOutputs}/log4j2.properties")
+ outputs.file log4jFile
+}
+
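+// note: each module contributes its snippet via a doFirst action; doFirst prepends, so reverse() restores module order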
+Closure writeLog4jProperties = {
+ String mainLog4jProperties = file('src/config/log4j2.properties').getText('UTF-8')
+ it.log4jFile.setText(mainLog4jProperties, 'UTF-8')
+ for (String moduleLog4jProperties : it.contents.reverse()) {
+ it.log4jFile.append(moduleLog4jProperties, 'UTF-8')
+ }
+}
+buildOssLog4jConfig.doLast(writeLog4jProperties)
+buildDefaultLog4jConfig.doLast(writeLog4jProperties)
+
+// copy log4j2.properties from modules that have it
+void copyLog4jProperties(Task buildTask, Project module) {
+ buildTask.doFirst {
+ FileTree tree = zipTree(module.bundlePlugin.outputs.files.singleFile)
+ FileTree filtered = tree.matching {
+ include 'config/log4j2.properties'
+ include '*/config/log4j2.properties' // could be in a bundled plugin
+ }
+ if (filtered.isEmpty() == false) {
+ buildTask.contents.add('\n\n' + filtered.singleFile.getText('UTF-8'))
+ }
+ }
}
ext.restTestExpansions = [
'expected.modules.count': 0,
]
-// we create the buildModules task above so the distribution subprojects can
-// depend on it, but we don't actually configure it until here so we can do a single
+// we create the buildOssModules task above but fill it here so we can do a single
// loop over modules to also setup cross task dependencies and increment our modules counter
project.rootProject.subprojects.findAll { it.parent.path == ':modules' }.each { Project module ->
- buildFullNotice {
- def defaultLicensesDir = new File(module.projectDir, 'licenses')
- if (defaultLicensesDir.exists()) {
- licensesDir defaultLicensesDir
- }
+ File licenses = new File(module.projectDir, 'licenses')
+ if (licenses.exists()) {
+ buildDefaultNotice.licensesDir licenses
+ buildOssNotice.licensesDir licenses
}
- buildModules {
- dependsOn({ project(module.path).bundlePlugin })
- into(module.name) {
- from { zipTree(project(module.path).bundlePlugin.outputs.files.singleFile) }
- }
+
+ copyModule(processOssOutputs, module)
+ if (module.name.startsWith('transport-')) {
+ copyModule(processTransportOutputs, module)
}
+
+ copyLog4jProperties(buildOssLog4jConfig, module)
+ copyLog4jProperties(buildDefaultLog4jConfig, module)
+
// make sure the module's integration tests run after the integ-test-zip (ie rest tests)
module.afterEvaluate({
module.integTest.mustRunAfter(':distribution:archives:integ-test-zip:integTest')
@@ -89,20 +199,19 @@ project.rootProject.subprojects.findAll { it.parent.path == ':modules' }.each {
restTestExpansions['expected.modules.count'] += 1
}
-// Integ tests work over the rest http layer, so we need a transport included with the integ test zip.
-// All transport modules are included so that they may be randomized for testing
-task buildTransportModules(type: Sync) {
- into 'build/transport-modules'
-}
-
-project.rootProject.subprojects.findAll { it.path.startsWith(':modules:transport-') }.each { Project transport ->
- buildTransportModules {
- dependsOn({ project(transport.path).bundlePlugin })
- into(transport.name) {
- from { zipTree(project(transport.path).bundlePlugin.outputs.files.singleFile) }
- }
+// use licenses from each of the bundled xpack plugins
+Project xpack = project(':x-pack:plugin')
+xpack.subprojects.findAll { it.name != 'bwc' }.each { Project xpackSubproject ->
+ File licenses = new File(xpackSubproject.projectDir, 'licenses')
+ if (licenses.exists()) {
+ buildDefaultNotice.licensesDir licenses
}
}
+// but copy just the top level meta plugin to the default modules
+copyModule(processDefaultOutputs, xpack)
+copyLog4jProperties(buildDefaultLog4jConfig, xpack)
+
// make sure we have a clean task since we aren't a java project, but we have tasks that
// put stuff in the build dir
@@ -130,45 +239,71 @@ configure(subprojects.findAll { ['archives', 'packages'].contains(it.name) }) {
from { project(':distribution:tools:plugin-cli').jar }
}
- modulesFiles = copySpec {
- into 'modules'
- from project(':distribution').buildModules
+ modulesFiles = { oss ->
+ copySpec {
+ eachFile {
+ if (it.relativePath.segments[-2] == 'bin') {
+ // bin files, wherever they are within modules (eg platform specific) should be executable
+ it.mode = 0755
+ }
+ }
+ if (oss) {
+ from project(':distribution').buildOssModules
+ } else {
+ from project(':distribution').buildDefaultModules
+ }
+ }
}
transportModulesFiles = copySpec {
- into "modules"
from project(':distribution').buildTransportModules
}
- configFiles = { distributionType ->
+ configFiles = { distributionType, oss ->
copySpec {
- from '../src/config'
- MavenFilteringHack.filter(it, expansionsForDistribution(distributionType))
+ with copySpec {
+ // main config files, processed with distribution specific substitutions
+ from '../src/config'
+ exclude 'log4j2.properties' // this is handled separately below
+ MavenFilteringHack.filter(it, expansionsForDistribution(distributionType, oss))
+ }
+ if (oss) {
+ from project(':distribution').buildOssLog4jConfig
+ from project(':distribution').buildOssConfig
+ } else {
+ from project(':distribution').buildDefaultLog4jConfig
+ from project(':distribution').buildDefaultConfig
+ }
}
}
- binFiles = { distributionType ->
+ binFiles = { distributionType, oss ->
copySpec {
- // everything except windows files
- from '../src/bin'
- exclude '*.bat'
- exclude '*.exe'
- eachFile { it.setMode(0755) }
- MavenFilteringHack.filter(it, expansionsForDistribution(distributionType))
+ with copySpec {
+ // main bin files, processed with distribution specific substitutions
+ // everything except windows files
+ from '../src/bin'
+ exclude '*.exe'
+ exclude '*.bat'
+ eachFile { it.setMode(0755) }
+ MavenFilteringHack.filter(it, expansionsForDistribution(distributionType, oss))
+ }
+ with copySpec {
+ eachFile { it.setMode(0755) }
+ if (oss) {
+ from project(':distribution').buildOssBin
+ } else {
+ from project(':distribution').buildDefaultBin
+ }
+ }
}
}
- commonFiles = copySpec {
- from rootProject.projectDir
- include 'LICENSE.txt'
- include 'README.textile'
- }
-
noticeFile = copySpec {
if (project.name == 'integ-test-zip') {
- from buildCoreNotice
+ from buildServerNotice
} else {
- from buildFullNotice
+ from buildDefaultNotice
}
}
}
@@ -176,7 +311,7 @@ configure(subprojects.findAll { ['archives', 'packages'].contains(it.name) }) {
}
task run(type: RunTask) {
- distribution = 'zip'
+ distribution = System.getProperty('run.distribution', 'zip')
}
/**
@@ -210,13 +345,22 @@ task run(type: RunTask) {
*
*/
subprojects {
- ext.expansionsForDistribution = { distributionType ->
+ ext.expansionsForDistribution = { distributionType, oss ->
final String defaultHeapSize = "1g"
final String packagingPathData = "path.data: /var/lib/elasticsearch"
final String pathLogs = "/var/log/elasticsearch"
final String packagingPathLogs = "path.logs: ${pathLogs}"
final String packagingLoggc = "${pathLogs}/gc.log"
+ String licenseText
+ if (oss) {
+ licenseText = rootProject.file('licenses/APACHE-LICENSE-2.0.txt').getText('UTF-8')
+ } else {
+ licenseText = rootProject.file('licenses/ELASTIC-LICENSE.txt').getText('UTF-8')
+ }
+ // license text needs to be indented with a single space
+ licenseText = ' ' + licenseText.replace('\n', '\n ')
+
String footer = "# Built for ${project.name}-${project.version} " +
"(${distributionType})"
Map expansions = [
@@ -281,6 +425,26 @@ subprojects {
'deb': "exit 0\n${footer}",
'def': footer
],
+
+ 'es.distribution.flavor': [
+ 'def': oss ? 'oss' : 'default'
+ ],
+
+
+ 'es.distribution.type': [
+ 'deb': 'deb',
+ 'rpm': 'rpm',
+ 'tar': 'tar',
+ 'zip': 'zip'
+ ],
+
+ 'license.name': [
+ 'deb': oss ? 'ASL-2.0' : 'Elastic-License'
+ ],
+
+ 'license.text': [
+ 'deb': licenseText,
+ ],
]
Map result = [:]
expansions = expansions.each { key, value ->
diff --git a/distribution/bwc/build.gradle b/distribution/bwc/build.gradle
index 48b84b4036240..42412c6230fa4 100644
--- a/distribution/bwc/build.gradle
+++ b/distribution/bwc/build.gradle
@@ -131,25 +131,28 @@ subprojects {
}
}
- String debDir = 'distribution/packages/deb'
- String rpmDir = 'distribution/packages/rpm'
- String zipDir = 'distribution/archives/zip'
- if (bwcVersion.before('6.3.0')) {
- debDir = 'distribution/deb'
- rpmDir = 'distribution/rpm'
- zipDir = 'distribution/zip'
+ List<File> artifactFiles = []
+ List<String> projectDirs = []
+ for (String project : ['zip', 'deb', 'rpm']) {
+ String baseDir = "distribution"
+ if (bwcVersion.onOrAfter('6.3.0')) {
+ baseDir += project == 'zip' ? '/archives' : '/packages'
+ // add oss variant first
+ projectDirs.add("${baseDir}/oss-${project}")
+ artifactFiles.add(file("${checkoutDir}/${baseDir}/oss-${project}/build/distributions/elasticsearch-oss-${bwcVersion}.${project}"))
+ }
+ projectDirs.add("${baseDir}/${project}")
+ artifactFiles.add(file("${checkoutDir}/${baseDir}/${project}/build/distributions/elasticsearch-${bwcVersion}.${project}"))
}
- File bwcDeb = file("${checkoutDir}/${debDir}/build/distributions/elasticsearch-${bwcVersion}.deb")
- File bwcRpm = file("${checkoutDir}/${rpmDir}/build/distributions/elasticsearch-${bwcVersion}.rpm")
- File bwcZip = file("${checkoutDir}/${zipDir}/build/distributions/elasticsearch-${bwcVersion}.zip")
+
task buildBwcVersion(type: Exec) {
dependsOn checkoutBwcBranch, writeBuildMetadata
workingDir = checkoutDir
if (["5.6", "6.0", "6.1"].contains(bwcBranch)) {
// we are building branches that are officially built with JDK 8, push JAVA8_HOME to JAVA_HOME for these builds
- environment('JAVA_HOME', "${-> getJavaHome(project, 8, "JAVA8_HOME is required to build BWC versions for BWC branch [" + bwcBranch + "]")}")
+ environment('JAVA_HOME', getJavaHome(it, 8))
} else if ("6.2".equals(bwcBranch)) {
- environment('JAVA_HOME', "${-> getJavaHome(project, 9, "JAVA9_HOME is required to build BWC versions for BWC branch [" + bwcBranch + "]")}")
+ environment('JAVA_HOME', getJavaHome(it, 9))
} else {
environment('JAVA_HOME', project.compilerJavaHome)
}
@@ -159,7 +162,10 @@ subprojects {
} else {
executable new File(checkoutDir, 'gradlew').toString()
}
- args ":${debDir.replace('/', ':')}:assemble", ":${rpmDir.replace('/', ':')}:assemble", ":${zipDir.replace('/', ':')}:assemble", "-Dbuild.snapshot=true"
+ for (String dir : projectDirs) {
+ args ":${dir.replace('/', ':')}:assemble"
+ }
+ args "-Dbuild.snapshot=true"
final LogLevel logLevel = gradle.startParameter.logLevel
if ([LogLevel.QUIET, LogLevel.WARN, LogLevel.INFO, LogLevel.DEBUG].contains(logLevel)) {
args "--${logLevel.name().toLowerCase(Locale.ENGLISH)}"
@@ -172,7 +178,7 @@ subprojects {
args "--full-stacktrace"
}
doLast {
- List<File> missing = [bwcDeb, bwcRpm, bwcZip].grep { file ->
+ List<File> missing = artifactFiles.grep { file ->
false == file.exists()
}
if (false == missing.empty) {
@@ -183,8 +189,10 @@ subprojects {
}
artifacts {
- 'default' file: bwcDeb, name: 'elasticsearch', type: 'deb', builtBy: buildBwcVersion
- 'default' file: bwcRpm, name: 'elasticsearch', type: 'rpm', builtBy: buildBwcVersion
- 'default' file: bwcZip, name: 'elasticsearch', type: 'zip', builtBy: buildBwcVersion
+ for (File artifactFile : artifactFiles) {
+ String artifactName = artifactFile.name.contains('oss') ? 'elasticsearch-oss' : 'elasticsearch'
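+ // the three character file extension (deb, rpm or zip) doubles as the artifact type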
+ String suffix = artifactFile.toString()[-3..-1]
+ 'default' file: artifactFile, name: artifactName, type: suffix, builtBy: buildBwcVersion
+ }
}
}
diff --git a/distribution/packages/build.gradle b/distribution/packages/build.gradle
index 6c5d149a10a31..33f98386a8987 100644
--- a/distribution/packages/build.gradle
+++ b/distribution/packages/build.gradle
@@ -15,9 +15,15 @@
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
+
import org.elasticsearch.gradle.LoggedExec
import org.elasticsearch.gradle.MavenFilteringHack
+import java.nio.file.Files
+import java.nio.file.Path
+import java.util.regex.Matcher
+import java.util.regex.Pattern
+
/*****************************************************************************
* Deb and rpm configuration *
*****************************************************************************
@@ -54,19 +60,22 @@ buildscript {
}
}
-void addProcessFilesTask(String type) {
- String packagingFiles = "build/packaging/${type}"
+void addProcessFilesTask(String type, boolean oss) {
+ String packagingFiles = "build/packaging/${ oss ? 'oss-' : ''}${type}"
- task("process${type.capitalize()}Files", type: Copy) {
- from 'src/common'
- from "src/${type}"
+ task("process${oss ? 'Oss' : ''}${type.capitalize()}Files", type: Copy) {
into packagingFiles
- into('config') {
- from '../src/config'
+ with copySpec {
+ from 'src/common'
+ from "src/${type}"
+ MavenFilteringHack.filter(it, expansionsForDistribution(type, oss))
}
- MavenFilteringHack.filter(it, expansionsForDistribution(type))
+ into('config') {
+ with configFiles(type, oss)
+ }
+ MavenFilteringHack.filter(it, expansionsForDistribution(type, oss))
doLast {
// create empty dirs, we set the permissions when configuring the packages
@@ -77,19 +86,24 @@ void addProcessFilesTask(String type) {
}
}
}
-addProcessFilesTask('deb')
-addProcessFilesTask('rpm')
+addProcessFilesTask('deb', true)
+addProcessFilesTask('deb', false)
+addProcessFilesTask('rpm', true)
+addProcessFilesTask('rpm', false)
// Common configuration that is package dependent. This can't go in ospackage
// since we have different templated files that need to be consumed, but the structure
// is the same
-Closure commonPackageConfig(String type) {
+Closure commonPackageConfig(String type, boolean oss) {
return {
+ dependsOn "process${oss ? 'Oss' : ''}${type.capitalize()}Files"
+ packageName "elasticsearch${oss ? '-oss' : ''}"
// Follow elasticsearch's file naming convention
- archiveName "elasticsearch-${project.version}.${type}"
+ archiveName "${packageName}-${project.version}.${type}"
- destinationDir = file("${type}/build/distributions")
- String packagingFiles = "build/packaging/${type}"
+ String prefix = "${oss ? 'oss-' : ''}${type}"
+ destinationDir = file("${prefix}/build/distributions")
+ String packagingFiles = "build/packaging/${prefix}"
String scripts = "${packagingFiles}/scripts"
preInstall file("${scripts}/preinst")
@@ -104,13 +118,40 @@ Closure commonPackageConfig(String type) {
// specify it again explicitly for copying common files
into('/usr/share/elasticsearch') {
into('bin') {
- with binFiles(type)
+ with binFiles(type, oss)
+ }
+ from(rootProject.projectDir) {
+ include 'README.textile'
+ }
+ into('modules') {
+ with copySpec {
+ with modulesFiles(oss)
+ // we need to specify every intermediate directory, but modules could have sub directories
+ // and there might not be any files as direct children of intermediates (eg platform)
+ // so we must iterate through the parents, but duplicate calls with the same path
+ // are ok (they don't show up in the built packages)
+ eachFile { FileCopyDetails fcp ->
+ String[] segments = fcp.relativePath.segments
+ for (int i = segments.length - 2; i > 0 && segments[i] != 'modules'; --i) {
+ directory('/' + segments[0..i].join('/'), 0755)
+ }
+ }
+ }
+ }
+ }
+
+ // license files
+ if (type == 'deb') {
+ into("/usr/share/doc/${packageName}") {
+ from "${packagingFiles}/copyright"
+ fileMode 0644
}
- with copySpec {
- with commonFiles
- if (type == 'deb') {
- // Deb gets a copyright file instead.
- exclude 'LICENSE.txt'
+ } else {
+ assert type == 'rpm'
+ into('/usr/share/elasticsearch') {
+ from(rootProject.file('licenses')) {
+ include oss ? 'APACHE-LICENSE-2.0.txt' : 'ELASTIC-LICENSE.txt'
+ rename { 'LICENSE.txt' }
}
}
}
@@ -120,7 +161,7 @@ Closure commonPackageConfig(String type) {
configurationFile '/etc/elasticsearch/jvm.options'
configurationFile '/etc/elasticsearch/log4j2.properties'
into('/etc/elasticsearch') {
- //dirMode 0750
+ dirMode 0750
fileMode 0660
permissionGroup 'elasticsearch'
includeEmptyDirs true
@@ -128,7 +169,7 @@ Closure commonPackageConfig(String type) {
fileType CONFIG | NOREPLACE
from "${packagingFiles}/config"
}
- String envFile = expansionsForDistribution(type)['path.env']
+ String envFile = expansionsForDistribution(type, false)['path.env']
configurationFile envFile
into(new File(envFile).getParent()) {
fileType CONFIG | NOREPLACE
@@ -176,6 +217,9 @@ Closure commonPackageConfig(String type) {
copyEmptyDir('/var/log/elasticsearch', 'elasticsearch', 'elasticsearch', 0750)
copyEmptyDir('/var/lib/elasticsearch', 'elasticsearch', 'elasticsearch', 0750)
copyEmptyDir('/usr/share/elasticsearch/plugins', 'root', 'root', 0755)
+
+ // the oss package conflicts with the default distribution and vice versa
+ conflicts('elasticsearch' + (oss ? '' : '-oss'))
}
}
@@ -183,7 +227,6 @@ apply plugin: 'nebula.ospackage-base'
// this is package indepdendent configuration
ospackage {
- packageName 'elasticsearch'
maintainer 'Elasticsearch Team <info@elastic.co>'
summary '''
Elasticsearch is a distributed RESTful search engine built for the cloud.
@@ -212,96 +255,88 @@ ospackage {
into '/usr/share/elasticsearch'
with libFiles
- with modulesFiles
with noticeFile
}
-task buildDeb(type: Deb) {
- dependsOn processDebFiles
- configure(commonPackageConfig('deb'))
+Closure commonDebConfig(boolean oss) {
+ return {
+ configure(commonPackageConfig('deb', oss))
+
+ // jdeb does not provide a way to set the License control attribute, and ospackage
+ // silently ignores setting it. Instead, we set the license as "custom field"
+ if (oss) {
+ customFields['License'] = 'ASL-2.0'
+ } else {
+ customFields['License'] = 'Elastic-License'
+ }
- version = project.version
- packageGroup 'web'
- requires 'bash'
- requires 'libc6'
- requires 'adduser'
+ version = project.version.replace('-', '~')
+ packageGroup 'web'
+ requires 'bash'
+ requires 'libc6'
+ requires 'adduser'
- into('/usr/share/lintian/overrides') {
- from('src/deb/lintian/elasticsearch')
- }
- into('/usr/share/doc/elasticsearch') {
- from 'src/deb/copyright'
- fileMode 0644
+ into('/usr/share/lintian/overrides') {
+ from('src/deb/lintian/elasticsearch')
+ }
}
}
-// task that sanity checks if the Deb archive can be extracted
-task checkDeb(type: LoggedExec) {
- dependsOn buildDeb
- onlyIf { new File('/usr/bin/dpkg-deb').exists() || new File('/usr/local/bin/dpkg-deb').exists() }
- final File debExtracted = new File("${buildDir}", 'deb-extracted')
- commandLine 'dpkg-deb', '-x', "deb/build/distributions/elasticsearch-${project.version}.deb", debExtracted
- doFirst {
- debExtracted.deleteDir()
- }
+task buildDeb(type: Deb) {
+ configure(commonDebConfig(false))
}
-task buildRpm(type: Rpm) {
- dependsOn processRpmFiles
- configure(commonPackageConfig('rpm'))
-
- packageGroup 'Application/Internet'
- requires '/bin/bash'
-
- prefix '/usr'
- packager 'Elasticsearch'
- version = project.version.replace('-', '_')
- release = '1'
- arch 'NOARCH'
- os 'LINUX'
- license '2009'
- distribution 'Elasticsearch'
- vendor 'Elasticsearch'
- // TODO ospackage doesn't support icon but we used to have one
-
- // without this the rpm will have parent dirs of any files we copy in, eg /etc/elasticsearch
- addParentDirs false
-
- // Declare the folders so that the RPM package manager removes
- // them when upgrading or removing the package
- directory('/usr/share/elasticsearch/bin', 0755)
- directory('/usr/share/elasticsearch/lib', 0755)
- directory('/usr/share/elasticsearch/modules', 0755)
- modulesFiles.eachFile { FileCopyDetails fcp ->
- if (fcp.name == "plugin-descriptor.properties") {
- directory('/usr/share/elasticsearch/modules/' + fcp.file.parentFile.name, 0755)
+task buildOssDeb(type: Deb) {
+ configure(commonDebConfig(true))
+}
+
+Closure commonRpmConfig(boolean oss) {
+ return {
+ configure(commonPackageConfig('rpm', oss))
+
+ if (oss) {
+ license 'ASL 2.0'
+ } else {
+ license 'Elastic License'
}
+
+ packageGroup 'Application/Internet'
+ requires '/bin/bash'
+
+ prefix '/usr'
+ packager 'Elasticsearch'
+ version = project.version.replace('-', '_')
+ release = '1'
+ arch 'NOARCH'
+ os 'LINUX'
+ distribution 'Elasticsearch'
+ vendor 'Elasticsearch'
+ // TODO ospackage doesn't support icon but we used to have one
+
+ // without this the rpm will have parent dirs of any files we copy in, eg /etc/elasticsearch
+ addParentDirs false
+
+ // Declare the folders so that the RPM package manager removes
+ // them when upgrading or removing the package
+ directory('/usr/share/elasticsearch/bin', 0755)
+ directory('/usr/share/elasticsearch/lib', 0755)
+ directory('/usr/share/elasticsearch/modules', 0755)
}
}
-// task that sanity checks if the RPM archive can be extracted
-task checkRpm(type: LoggedExec) {
- dependsOn buildRpm
- onlyIf { new File('/bin/rpm').exists() || new File('/usr/bin/rpm').exists() || new File('/usr/local/bin/rpm').exists() }
- final File rpmDatabase = new File("${buildDir}", 'rpm-database')
- final File rpmExtracted = new File("${buildDir}", 'rpm-extracted')
- commandLine 'rpm',
- '--badreloc',
- '--nodeps',
- '--noscripts',
- '--notriggers',
- '--dbpath',
- rpmDatabase,
- '--relocate',
- "/=${rpmExtracted}",
- '-i',
- "rpm/build/distributions/elasticsearch-${project.version}.rpm"
- doFirst {
- rpmDatabase.deleteDir()
- rpmExtracted.deleteDir()
- }
+task buildRpm(type: Rpm) {
+ configure(commonRpmConfig(false))
+}
+
+task buildOssRpm(type: Rpm) {
+ configure(commonRpmConfig(true))
}
+Closure dpkgExists = { it -> new File('/bin/dpkg-deb').exists() || new File('/usr/bin/dpkg-deb').exists() || new File('/usr/local/bin/dpkg-deb').exists() }
+Closure rpmExists = { it -> new File('/bin/rpm').exists() || new File('/usr/bin/rpm').exists() || new File('/usr/local/bin/rpm').exists() }
+
+Closure debFilter = { f -> f.name.endsWith('.deb') }
+
// This configures the default artifact for the distribution specific
// subprojects. We have subprojects because Gradle project substitutions
// can only bind to the default configuration of a project
@@ -313,7 +348,164 @@ subprojects {
artifacts {
'default' buildDist
}
-}
-check.dependsOn checkDeb, checkRpm
+ // sanity checks if packages can be extracted
+ final File extractionDir = new File(buildDir, 'extracted')
+ final File packageExtractionDir
+ if (project.name.contains('deb')) {
+ packageExtractionDir = new File(extractionDir, 'deb-extracted')
+ } else {
+ assert project.name.contains('rpm')
+ packageExtractionDir = new File(extractionDir, 'rpm-extracted')
+ }
+ task checkExtraction(type: LoggedExec) {
+ dependsOn buildDist
+ doFirst {
+ project.delete(extractionDir)
+ extractionDir.mkdirs()
+ }
+ }
+ check.dependsOn checkExtraction
+ if (project.name.contains('deb')) {
+ checkExtraction {
+ onlyIf dpkgExists
+ commandLine 'dpkg-deb', '-x', "${-> buildDist.outputs.files.filter(debFilter).singleFile}", packageExtractionDir
+ }
+ } else {
+ assert project.name.contains('rpm')
+ checkExtraction {
+ onlyIf rpmExists
+ final File rpmDatabase = new File(extractionDir, 'rpm-database')
+ commandLine 'rpm',
+ '--badreloc',
+ '--nodeps',
+ '--noscripts',
+ '--notriggers',
+ '--dbpath',
+ rpmDatabase,
+ '--relocate',
+ "/=${packageExtractionDir}",
+ '-i',
+ "${-> buildDist.outputs.files.singleFile}"
+ }
+ }
+
+ task checkLicense {
+ dependsOn buildDist, checkExtraction
+ }
+ check.dependsOn checkLicense
+ if (project.name.contains('deb')) {
+ checkLicense {
+ onlyIf dpkgExists
+ doLast {
+ final Path copyrightPath
+ final String expectedLicense
+ final String licenseFilename
+ if (project.name.contains('oss-')) {
+ copyrightPath = packageExtractionDir.toPath().resolve("usr/share/doc/elasticsearch-oss/copyright")
+ expectedLicense = "ASL-2.0"
+ licenseFilename = "APACHE-LICENSE-2.0.txt"
+ } else {
+ copyrightPath = packageExtractionDir.toPath().resolve("usr/share/doc/elasticsearch/copyright")
+ expectedLicense = "Elastic-License"
+ licenseFilename = "ELASTIC-LICENSE.txt"
+ }
+ final List<String> header = Arrays.asList("Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/",
+ "Copyright: Elasticsearch B.V. <info@elastic.co>",
+ "License: " + expectedLicense)
+ final List<String> licenseLines = Files.readAllLines(rootDir.toPath().resolve("licenses/" + licenseFilename))
+ final List<String> expectedLines = header + licenseLines.collect { " " + it }
+ assertLinesInFile(copyrightPath, expectedLines)
+ }
+ }
+ } else {
+ assert project.name.contains('rpm')
+ checkLicense {
+ onlyIf rpmExists
+ doLast {
+ final String licenseFilename
+ if (project.name.contains('oss-')) {
+ licenseFilename = "APACHE-LICENSE-2.0.txt"
+ } else {
+ licenseFilename = "ELASTIC-LICENSE.txt"
+ }
+ final List<String> licenseLines = Files.readAllLines(rootDir.toPath().resolve("licenses/" + licenseFilename))
+ final Path licensePath = packageExtractionDir.toPath().resolve("usr/share/elasticsearch/LICENSE.txt")
+ assertLinesInFile(licensePath, licenseLines)
+ }
+ }
+ }
+
+ task checkNotice {
+ dependsOn buildDist, checkExtraction
+ onlyIf { (project.name.contains('deb') && dpkgExists.call(it)) || (project.name.contains('rpm') && rpmExists.call(it)) }
+ doLast {
+ final List<String> noticeLines = Arrays.asList("Elasticsearch", "Copyright 2009-2018 Elasticsearch")
+ final Path noticePath = packageExtractionDir.toPath().resolve("usr/share/elasticsearch/NOTICE.txt")
+ assertLinesInFile(noticePath, noticeLines)
+ }
+ }
+ check.dependsOn checkNotice
+
+ task checkLicenseMetadata(type: LoggedExec) {
+ dependsOn buildDist, checkExtraction
+ }
+ check.dependsOn checkLicenseMetadata
+ if (project.name.contains('deb')) {
+ checkLicenseMetadata { LoggedExec exec ->
+ onlyIf dpkgExists
+ final ByteArrayOutputStream output = new ByteArrayOutputStream()
+ exec.commandLine 'dpkg-deb', '--info', "${ -> buildDist.outputs.files.filter(debFilter).singleFile}"
+ exec.standardOutput = output
+ doLast {
+ final String expectedLicense
+ if (project.name.contains('oss-')) {
+ expectedLicense = "ASL-2.0"
+ } else {
+ expectedLicense = "Elastic-License"
+ }
+ final Pattern pattern = Pattern.compile("\\s*License: (.+)")
+ final String info = output.toString('UTF-8')
+ final String[] actualLines = info.split("\n")
+ int count = 0
+ for (final String actualLine : actualLines) {
+ final Matcher matcher = pattern.matcher(actualLine)
+ if (matcher.matches()) {
+ count++
+ final String actualLicense = matcher.group(1)
+ if (expectedLicense != actualLicense) {
+ throw new GradleException("expected license [${expectedLicense} for package info but found [${actualLicense}]")
+ }
+ }
+ }
+ if (count == 0) {
+ throw new GradleException("expected license [${expectedLicense}] for package info but found none in:\n${info}")
+ }
+ if (count > 1) {
+ throw new GradleException("expected a single license for package info but found [${count}] in:\n${info}")
+ }
+ }
+ }
+ } else {
+ assert project.name.contains('rpm')
+ checkLicenseMetadata { LoggedExec exec ->
+ onlyIf rpmExists
+ final ByteArrayOutputStream output = new ByteArrayOutputStream()
+ exec.commandLine 'rpm', '-qp', '--queryformat', '%{License}', "${-> buildDist.outputs.files.singleFile}"
+ exec.standardOutput = output
+ doLast {
+ final String license = output.toString('UTF-8')
+ final String expectedLicense
+ if (project.name.contains('oss-')) {
+ expectedLicense = "ASL 2.0"
+ } else {
+ expectedLicense = "Elastic License"
+ }
+ if (license != expectedLicense) {
+ throw new GradleException("expected license [${expectedLicense}] for [${-> buildDist.outputs.files.singleFile}] but was [${license}]")
+ }
+ }
+ }
+ }
+}
diff --git a/distribution/packages/oss-deb/build.gradle b/distribution/packages/oss-deb/build.gradle
new file mode 100644
index 0000000000000..4a6dde5fc0c92
--- /dev/null
+++ b/distribution/packages/oss-deb/build.gradle
@@ -0,0 +1,2 @@
+// This file is intentionally blank. All configuration of the
+// distribution is done in the parent project.
diff --git a/distribution/packages/oss-rpm/build.gradle b/distribution/packages/oss-rpm/build.gradle
new file mode 100644
index 0000000000000..4a6dde5fc0c92
--- /dev/null
+++ b/distribution/packages/oss-rpm/build.gradle
@@ -0,0 +1,2 @@
+// This file is intentionally blank. All configuration of the
+// distribution is done in the parent project.
diff --git a/distribution/packages/src/deb/copyright b/distribution/packages/src/deb/copyright
index 98a923677c907..44c7582666f21 100644
--- a/distribution/packages/src/deb/copyright
+++ b/distribution/packages/src/deb/copyright
@@ -1,17 +1,4 @@
-Copyright 2013-2018 Elasticsearch
-
-License: Apache-2.0
- Licensed under the Apache License, Version 2.0 (the "License");
- you may not use this file except in compliance with the License.
- You may obtain a copy of the License at
- .
- http://www.apache.org/licenses/LICENSE-2.0
- .
- Unless required by applicable law or agreed to in writing, software
- distributed under the License is distributed on an "AS IS" BASIS,
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- See the License for the specific language governing permissions and
- limitations under the License.
- .
- On Debian systems, the complete text of the Apache version 2.0 license
- can be found in "/usr/share/common-licenses/Apache-2.0".
+Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/
+Copyright: Elasticsearch B.V. <info@elastic.co>
+License: ${license.name}
+${license.text}
diff --git a/distribution/src/bin/elasticsearch b/distribution/src/bin/elasticsearch
index 11efddf6e2678..84e14eea3f6f8 100755
--- a/distribution/src/bin/elasticsearch
+++ b/distribution/src/bin/elasticsearch
@@ -28,6 +28,8 @@ if ! echo $* | grep -E '(^-d |-d$| -d |--daemonize$|--daemonize )' > /dev/null;
$ES_JAVA_OPTS \
-Des.path.home="$ES_HOME" \
-Des.path.conf="$ES_PATH_CONF" \
+ -Des.distribution.flavor="$ES_DISTRIBUTION_FLAVOR" \
+ -Des.distribution.type="$ES_DISTRIBUTION_TYPE" \
-cp "$ES_CLASSPATH" \
org.elasticsearch.bootstrap.Elasticsearch \
"$@"
@@ -37,6 +39,8 @@ else
$ES_JAVA_OPTS \
-Des.path.home="$ES_HOME" \
-Des.path.conf="$ES_PATH_CONF" \
+ -Des.distribution.flavor="$ES_DISTRIBUTION_FLAVOR" \
+ -Des.distribution.type="$ES_DISTRIBUTION_TYPE" \
-cp "$ES_CLASSPATH" \
org.elasticsearch.bootstrap.Elasticsearch \
"$@" \
diff --git a/distribution/src/bin/elasticsearch-env b/distribution/src/bin/elasticsearch-env
index cc86a10b184ae..9d58d88e7aaf1 100644
--- a/distribution/src/bin/elasticsearch-env
+++ b/distribution/src/bin/elasticsearch-env
@@ -77,6 +77,9 @@ fi
# now make ES_PATH_CONF absolute
ES_PATH_CONF=`cd "$ES_PATH_CONF"; pwd`
+ES_DISTRIBUTION_FLAVOR=${es.distribution.flavor}
+ES_DISTRIBUTION_TYPE=${es.distribution.type}
+
if [ -z "$ES_TMPDIR" ]; then
set +e
mktemp --version 2>&1 | grep coreutils > /dev/null
diff --git a/distribution/src/bin/elasticsearch-env.bat b/distribution/src/bin/elasticsearch-env.bat
index 2499c0d99a4da..b0d015924b440 100644
--- a/distribution/src/bin/elasticsearch-env.bat
+++ b/distribution/src/bin/elasticsearch-env.bat
@@ -53,6 +53,9 @@ if not defined ES_PATH_CONF (
rem now make ES_PATH_CONF absolute
for %%I in ("%ES_PATH_CONF%..") do set ES_PATH_CONF=%%~dpfI
+set ES_DISTRIBUTION_FLAVOR=${es.distribution.flavor}
+set ES_DISTRIBUTION_TYPE=${es.distribution.type}
+
if not defined ES_TMPDIR (
set ES_TMPDIR=!TMP!\elasticsearch
)
diff --git a/distribution/src/bin/elasticsearch-keystore b/distribution/src/bin/elasticsearch-keystore
index 8797e7c07a613..aee62dfde50d4 100755
--- a/distribution/src/bin/elasticsearch-keystore
+++ b/distribution/src/bin/elasticsearch-keystore
@@ -7,6 +7,8 @@ exec \
$ES_JAVA_OPTS \
-Des.path.home="$ES_HOME" \
-Des.path.conf="$ES_PATH_CONF" \
+ -Des.distribution.flavor="$ES_DISTRIBUTION_FLAVOR" \
+ -Des.distribution.type="$ES_DISTRIBUTION_TYPE" \
-cp "$ES_CLASSPATH" \
org.elasticsearch.common.settings.KeyStoreCli \
"$@"
diff --git a/distribution/src/bin/elasticsearch-keystore.bat b/distribution/src/bin/elasticsearch-keystore.bat
index 7e131a80a1b6c..1d6616983d8cc 100644
--- a/distribution/src/bin/elasticsearch-keystore.bat
+++ b/distribution/src/bin/elasticsearch-keystore.bat
@@ -9,6 +9,8 @@ call "%~dp0elasticsearch-env.bat" || exit /b 1
%ES_JAVA_OPTS% ^
-Des.path.home="%ES_HOME%" ^
-Des.path.conf="%ES_PATH_CONF%" ^
+ -Des.distribution.flavor="%ES_DISTRIBUTION_FLAVOR%" ^
+ -Des.distribution.type="%ES_DISTRIBUTION_TYPE%" ^
-cp "%ES_CLASSPATH%" ^
org.elasticsearch.common.settings.KeyStoreCli ^
%*
diff --git a/distribution/src/bin/elasticsearch-plugin b/distribution/src/bin/elasticsearch-plugin
index a2e228d490af5..500fd710c1aea 100755
--- a/distribution/src/bin/elasticsearch-plugin
+++ b/distribution/src/bin/elasticsearch-plugin
@@ -7,6 +7,8 @@ exec \
$ES_JAVA_OPTS \
-Des.path.home="$ES_HOME" \
-Des.path.conf="$ES_PATH_CONF" \
+ -Des.distribution.flavor="$ES_DISTRIBUTION_FLAVOR" \
+ -Des.distribution.type="$ES_DISTRIBUTION_TYPE" \
-cp "$ES_CLASSPATH" \
org.elasticsearch.plugins.PluginCli \
"$@"
diff --git a/distribution/src/bin/elasticsearch-plugin.bat b/distribution/src/bin/elasticsearch-plugin.bat
index 1d059aaaceee9..b3b94a31863f1 100644
--- a/distribution/src/bin/elasticsearch-plugin.bat
+++ b/distribution/src/bin/elasticsearch-plugin.bat
@@ -9,6 +9,8 @@ call "%~dp0elasticsearch-env.bat" || exit /b 1
%ES_JAVA_OPTS% ^
-Des.path.home="%ES_HOME%" ^
-Des.path.conf="%ES_PATH_CONF%" ^
+ -Des.distribution.flavor="%ES_DISTRIBUTION_FLAVOR%" ^
+ -Des.distribution.type="%ES_DISTRIBUTION_TYPE%" ^
-cp "%ES_CLASSPATH%" ^
org.elasticsearch.plugins.PluginCli ^
%*
diff --git a/distribution/src/bin/elasticsearch-service.bat b/distribution/src/bin/elasticsearch-service.bat
index e4f3e92b084c4..a1d0f04560e70 100644
--- a/distribution/src/bin/elasticsearch-service.bat
+++ b/distribution/src/bin/elasticsearch-service.bat
@@ -159,7 +159,7 @@ if "%JVM_SS%" == "" (
goto:eof
)
-set ES_PARAMS=-Delasticsearch;-Des.path.home="%ES_HOME%";-Des.path.conf="%ES_PATH_CONF%"
+set ES_PARAMS=-Delasticsearch;-Des.path.home="%ES_HOME%";-Des.path.conf="%ES_PATH_CONF%";-Des.distribution.flavor="%ES_DISTRIBUTION_FLAVOR%";-Des.distribution.type="%ES_DISTRIBUTION_TYPE%"
if "%ES_START_TYPE%" == "" set ES_START_TYPE=manual
if "%ES_STOP_TIMEOUT%" == "" set ES_STOP_TIMEOUT=0
diff --git a/distribution/src/bin/elasticsearch-translog b/distribution/src/bin/elasticsearch-translog
index dcb52c29ea381..e176231c6f44d 100755
--- a/distribution/src/bin/elasticsearch-translog
+++ b/distribution/src/bin/elasticsearch-translog
@@ -7,6 +7,8 @@ exec \
$ES_JAVA_OPTS \
-Des.path.home="$ES_HOME" \
-Des.path.conf="$ES_PATH_CONF" \
+ -Des.distribution.flavor="$ES_DISTRIBUTION_FLAVOR" \
+ -Des.distribution.type="$ES_DISTRIBUTION_TYPE" \
-cp "$ES_CLASSPATH" \
org.elasticsearch.index.translog.TranslogToolCli \
"$@"
diff --git a/distribution/src/bin/elasticsearch-translog.bat b/distribution/src/bin/elasticsearch-translog.bat
index 4f15e9b379250..492c1f0831263 100644
--- a/distribution/src/bin/elasticsearch-translog.bat
+++ b/distribution/src/bin/elasticsearch-translog.bat
@@ -9,6 +9,8 @@ call "%~dp0elasticsearch-env.bat" || exit /b 1
%ES_JAVA_OPTS% ^
-Des.path.home="%ES_HOME%" ^
-Des.path.conf="%ES_PATH_CONF%" ^
+ -Des.distribution.flavor="%ES_DISTRIBUTION_FLAVOR%" ^
+ -Des.distribution.type="%ES_DISTRIBUTION_TYPE%" ^
-cp "%ES_CLASSPATH%" ^
org.elasticsearch.index.translog.TranslogToolCli ^
%*
diff --git a/distribution/src/bin/elasticsearch.bat b/distribution/src/bin/elasticsearch.bat
index e0f52c54c627f..6e268c9b13321 100644
--- a/distribution/src/bin/elasticsearch.bat
+++ b/distribution/src/bin/elasticsearch.bat
@@ -51,7 +51,7 @@ if "%MAYBE_JVM_OPTIONS_PARSER_FAILED%" == "jvm_options_parser_failed" (
)
cd /d "%ES_HOME%"
-%JAVA% %ES_JAVA_OPTS% -Delasticsearch -Des.path.home="%ES_HOME%" -Des.path.conf="%ES_PATH_CONF%" -cp "%ES_CLASSPATH%" "org.elasticsearch.bootstrap.Elasticsearch" !newparams!
+%JAVA% %ES_JAVA_OPTS% -Delasticsearch -Des.path.home="%ES_HOME%" -Des.path.conf="%ES_PATH_CONF%" -Des.distribution.flavor="%ES_DISTRIBUTION_FLAVOR%" -Des.distribution.type="%ES_DISTRIBUTION_TYPE%" -cp "%ES_CLASSPATH%" "org.elasticsearch.bootstrap.Elasticsearch" !newparams!
endlocal
endlocal
diff --git a/distribution/tools/plugin-cli/src/main/java/org/elasticsearch/plugins/InstallPluginCommand.java b/distribution/tools/plugin-cli/src/main/java/org/elasticsearch/plugins/InstallPluginCommand.java
index e1733e478b8c2..71c57f7f10135 100644
--- a/distribution/tools/plugin-cli/src/main/java/org/elasticsearch/plugins/InstallPluginCommand.java
+++ b/distribution/tools/plugin-cli/src/main/java/org/elasticsearch/plugins/InstallPluginCommand.java
@@ -21,10 +21,9 @@
import joptsimple.OptionSet;
import joptsimple.OptionSpec;
-
import org.apache.lucene.search.spell.LevensteinDistance;
import org.apache.lucene.util.CollectionUtil;
-import org.elasticsearch.core.internal.io.IOUtils;
+import org.elasticsearch.Build;
import org.elasticsearch.Version;
import org.elasticsearch.bootstrap.JarHell;
import org.elasticsearch.cli.EnvironmentAwareCommand;
@@ -35,7 +34,7 @@
import org.elasticsearch.common.SuppressForbidden;
import org.elasticsearch.common.collect.Tuple;
import org.elasticsearch.common.hash.MessageDigests;
-import org.elasticsearch.common.settings.KeyStoreWrapper;
+import org.elasticsearch.core.internal.io.IOUtils;
import org.elasticsearch.env.Environment;
import java.io.BufferedReader;
@@ -152,7 +151,6 @@ class InstallPluginCommand extends EnvironmentAwareCommand {
plugins.add(line.trim());
line = reader.readLine();
}
- plugins.add("x-pack");
OFFICIAL_PLUGINS = Collections.unmodifiableSet(plugins);
} catch (IOException e) {
throw new RuntimeException(e);
@@ -218,11 +216,32 @@ void execute(Terminal terminal, String pluginId, boolean isBatch, Environment en
throw new UserException(ExitCodes.USAGE, "plugin id is required");
}
+ if ("x-pack".equals(pluginId)) {
+ handleInstallXPack(buildFlavor());
+ }
+
Path pluginZip = download(terminal, pluginId, env.tmpFile());
Path extractedZip = unzip(pluginZip, env.pluginsFile());
install(terminal, isBatch, extractedZip, env);
}
+ Build.Flavor buildFlavor() {
+ return Build.CURRENT.flavor();
+ }
+
+ private static void handleInstallXPack(final Build.Flavor flavor) throws UserException {
+ switch (flavor) {
+ case DEFAULT:
+ throw new UserException(ExitCodes.CONFIG, "this distribution of Elasticsearch contains X-Pack by default");
+ case OSS:
+ throw new UserException(
+ ExitCodes.CONFIG,
+ "X-Pack is not available with the oss distribution; to use X-Pack features use the default distribution");
+ case UNKNOWN:
+ throw new IllegalStateException("your distribution is broken");
+ }
+ }
+
/** Downloads the plugin and returns the file it was downloaded to. */
private Path download(Terminal terminal, String pluginId, Path tmpDir) throws Exception {
if (OFFICIAL_PLUGINS.contains(pluginId)) {
@@ -571,6 +590,9 @@ private void verifyPluginName(Path pluginPath, String pluginName, Path candidate
/** Load information about the plugin, and verify it can be installed with no errors. */
private PluginInfo loadPluginInfo(Terminal terminal, Path pluginRoot, boolean isBatch, Environment env) throws Exception {
final PluginInfo info = PluginInfo.readFromProperties(pluginRoot);
+ if (info.hasNativeController()) {
+ throw new IllegalStateException("plugins can not have native controllers");
+ }
PluginsService.verifyCompatibility(info);
// checking for existing version of the plugin
@@ -659,19 +681,16 @@ private void installMetaPlugin(Terminal terminal, boolean isBatch, Path tmpRoot,
Set<String> permissions = new HashSet<>();
final List<PluginInfo> pluginInfos = new ArrayList<>();
- boolean hasNativeController = false;
for (Path plugin : pluginPaths) {
final PluginInfo info = loadPluginInfo(terminal, plugin, isBatch, env);
pluginInfos.add(info);
- hasNativeController |= info.hasNativeController();
-
Path policy = plugin.resolve(PluginInfo.ES_PLUGIN_POLICY);
if (Files.exists(policy)) {
permissions.addAll(PluginSecurity.parsePermissions(policy, env.tmpFile()));
}
}
- PluginSecurity.confirmPolicyExceptions(terminal, permissions, hasNativeController, isBatch);
+ PluginSecurity.confirmPolicyExceptions(terminal, permissions, isBatch);
// move support files and rename as needed to prepare the exploded plugin for its final location
for (int i = 0; i < pluginPaths.size(); ++i) {
@@ -704,7 +723,7 @@ private void installPlugin(Terminal terminal, boolean isBatch, Path tmpRoot,
} else {
permissions = Collections.emptySet();
}
- PluginSecurity.confirmPolicyExceptions(terminal, permissions, info.hasNativeController(), isBatch);
+ PluginSecurity.confirmPolicyExceptions(terminal, permissions, isBatch);
final Path destination = env.pluginsFile().resolve(info.getName());
deleteOnFailure.add(destination);
diff --git a/distribution/tools/plugin-cli/src/test/java/org/elasticsearch/plugins/InstallPluginCommandTests.java b/distribution/tools/plugin-cli/src/test/java/org/elasticsearch/plugins/InstallPluginCommandTests.java
index 96e009b3462f1..5931e66cb9a5d 100644
--- a/distribution/tools/plugin-cli/src/test/java/org/elasticsearch/plugins/InstallPluginCommandTests.java
+++ b/distribution/tools/plugin-cli/src/test/java/org/elasticsearch/plugins/InstallPluginCommandTests.java
@@ -22,8 +22,8 @@
import com.carrotsearch.randomizedtesting.annotations.ParametersFactory;
import com.google.common.jimfs.Configuration;
import com.google.common.jimfs.Jimfs;
-
import org.apache.lucene.util.LuceneTestCase;
+import org.elasticsearch.Build;
import org.elasticsearch.Version;
import org.elasticsearch.cli.ExitCodes;
import org.elasticsearch.cli.MockTerminal;
@@ -35,7 +35,6 @@
import org.elasticsearch.common.io.FileSystemUtils;
import org.elasticsearch.common.io.PathUtils;
import org.elasticsearch.common.io.PathUtilsForTesting;
-import org.elasticsearch.common.settings.KeyStoreWrapper;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.env.Environment;
import org.elasticsearch.env.TestEnvironment;
@@ -479,6 +478,15 @@ public void testBuiltinModule() throws Exception {
assertInstallCleaned(env.v2());
}
+ public void testBuiltinXpackModule() throws Exception {
+ Tuple<Path, Environment> env = createEnv(fs, temp);
+ Path pluginDir = createPluginDir(temp);
+ String pluginZip = createPluginUrl("x-pack", pluginDir);
+ UserException e = expectThrows(UserException.class, () -> installPlugin(pluginZip, env.v1()));
+ assertTrue(e.getMessage(), e.getMessage().contains("is a system module"));
+ assertInstallCleaned(env.v2());
+ }
+
public void testJarHell() throws Exception {
// jar hell test needs a real filesystem
assumeTrue("real filesystem", isReal);
@@ -881,23 +889,33 @@ protected boolean addShutdownHook() {
}
}
- public void testOfficialPluginsIncludesXpack() throws Exception {
- MockTerminal terminal = new MockTerminal();
- new InstallPluginCommand() {
+ public void testInstallXPack() throws IOException {
+ runInstallXPackTest(Build.Flavor.DEFAULT, UserException.class, "this distribution of Elasticsearch contains X-Pack by default");
+ runInstallXPackTest(
+ Build.Flavor.OSS,
+ UserException.class,
+ "X-Pack is not available with the oss distribution; to use X-Pack features use the default distribution");
+ runInstallXPackTest(Build.Flavor.UNKNOWN, IllegalStateException.class, "your distribution is broken");
+ }
+
+ private <T extends Exception> void runInstallXPackTest(
+ final Build.Flavor flavor, final Class<T> clazz, final String expectedMessage) throws IOException {
+ final InstallPluginCommand flavorCommand = new InstallPluginCommand() {
@Override
- protected boolean addShutdownHook() {
- return false;
+ Build.Flavor buildFlavor() {
+ return flavor;
}
- }.main(new String[] { "--help" }, terminal);
- assertTrue(terminal.getOutput(), terminal.getOutput().contains("x-pack"));
+ };
+
+ final Environment environment = createEnv(fs, temp).v2();
+ final T exception = expectThrows(clazz, () -> flavorCommand.execute(terminal, "x-pack", false, environment));
+ assertThat(exception, hasToString(containsString(expectedMessage)));
}
public void testInstallMisspelledOfficialPlugins() throws Exception {
Tuple<Path, Environment> env = createEnv(fs, temp);
- UserException e = expectThrows(UserException.class, () -> installPlugin("xpack", env.v1()));
- assertThat(e.getMessage(), containsString("Unknown plugin xpack, did you mean [x-pack]?"));
- e = expectThrows(UserException.class, () -> installPlugin("analysis-smartnc", env.v1()));
+ UserException e = expectThrows(UserException.class, () -> installPlugin("analysis-smartnc", env.v1()));
assertThat(e.getMessage(), containsString("Unknown plugin analysis-smartnc, did you mean [analysis-smartcn]?"));
e = expectThrows(UserException.class, () -> installPlugin("repository", env.v1()));
@@ -1224,42 +1242,16 @@ public void testMetaPluginPolicyConfirmation() throws Exception {
assertMetaPlugin("meta-plugin", "fake2", metaDir, env.v2());
}
- public void testNativeControllerConfirmation() throws Exception {
- Tuple<Path, Environment> env = createEnv(fs, temp);
- Path pluginDir = createPluginDir(temp);
- String pluginZip = createPluginUrl("fake", pluginDir, "has.native.controller", "true");
-
- assertPolicyConfirmation(env, pluginZip, "plugin forks a native controller");
- assertPlugin("fake", pluginDir, env.v2());
- }
-
- public void testMetaPluginNativeControllerConfirmation() throws Exception {
- Tuple<Path, Environment> env = createEnv(fs, temp);
- Path metaDir = createPluginDir(temp);
- Path fake1Dir = metaDir.resolve("fake1");
- Files.createDirectory(fake1Dir);
- writePlugin("fake1", fake1Dir, "has.native.controller", "true");
- Path fake2Dir = metaDir.resolve("fake2");
- Files.createDirectory(fake2Dir);
- writePlugin("fake2", fake2Dir);
- String pluginZip = createMetaPluginUrl("meta-plugin", metaDir);
-
- assertPolicyConfirmation(env, pluginZip, "plugin forks a native controller");
- assertMetaPlugin("meta-plugin", "fake1", metaDir, env.v2());
- assertMetaPlugin("meta-plugin", "fake2", metaDir, env.v2());
- }
-
- public void testNativeControllerAndPolicyConfirmation() throws Exception {
+ public void testPluginWithNativeController() throws Exception {
Tuple<Path, Environment> env = createEnv(fs, temp);
Path pluginDir = createPluginDir(temp);
- writePluginSecurityPolicy(pluginDir, "setAccessible", "setFactory");
String pluginZip = createPluginUrl("fake", pluginDir, "has.native.controller", "true");
- assertPolicyConfirmation(env, pluginZip, "plugin requires additional permissions", "plugin forks a native controller");
- assertPlugin("fake", pluginDir, env.v2());
+ final IllegalStateException e = expectThrows(IllegalStateException.class, () -> installPlugin(pluginZip, env.v1()));
+ assertThat(e, hasToString(containsString("plugins can not have native controllers")));
}
- public void testMetaPluginNativeControllerAndPolicyConfirmation() throws Exception {
+ public void testMetaPluginWithNativeController() throws Exception {
Tuple<Path, Environment> env = createEnv(fs, temp);
Path metaDir = createPluginDir(temp);
Path fake1Dir = metaDir.resolve("fake1");
@@ -1271,8 +1263,8 @@ public void testMetaPluginNativeControllerAndPolicyConfirmation() throws Excepti
writePlugin("fake2", fake2Dir, "has.native.controller", "true");
String pluginZip = createMetaPluginUrl("meta-plugin", metaDir);
- assertPolicyConfirmation(env, pluginZip, "plugin requires additional permissions", "plugin forks a native controller");
- assertMetaPlugin("meta-plugin", "fake1", metaDir, env.v2());
- assertMetaPlugin("meta-plugin", "fake2", metaDir, env.v2());
+ final IllegalStateException e = expectThrows(IllegalStateException.class, () -> installPlugin(pluginZip, env.v1()));
+ assertThat(e, hasToString(containsString("plugins can not have native controllers")));
}
+
}
diff --git a/docs/CHANGELOG.asciidoc b/docs/CHANGELOG.asciidoc
new file mode 100644
index 0000000000000..98be1db1b6d52
--- /dev/null
+++ b/docs/CHANGELOG.asciidoc
@@ -0,0 +1,46 @@
+// Use these for links to issue and pulls. Note issues and pulls redirect one to
+// each other on Github, so don't worry too much on using the right prefix.
+// :issue: https://github.com/elastic/elasticsearch/issues/
+// :pull: https://github.com/elastic/elasticsearch/pull/
+
+= Elasticsearch Release Notes
+
+== Elasticsearch 7.0.0
+
+=== Breaking Changes
+
+<> ({pull}29609[#29609])
+
+<> ({pull}29635[#29635])
+
+=== Breaking Java Changes
+
+=== Deprecations
+
+=== New Features
+
+=== Enhancements
+
+=== Bug Fixes
+
+Fail snapshot operations early when creating or deleting a snapshot on a repository that has been
+written to by an older Elasticsearch after writing to it with a newer Elasticsearch version. ({pull}30140[#30140])
+
+=== Regressions
+
+=== Known Issues
+
+== Elasticsearch version 6.4.0
+
+=== New Features
+
+=== Enhancements
+
+=== Bug Fixes
+
+=== Regressions
+
+=== Known Issues
+
+
diff --git a/docs/build.gradle b/docs/build.gradle
index 97094c6e79cbe..5057bead62d9b 100644
--- a/docs/build.gradle
+++ b/docs/build.gradle
@@ -20,6 +20,7 @@
apply plugin: 'elasticsearch.docs-test'
integTestCluster {
+ distribution = 'oss-zip'
/* Enable regexes in painless so our tests don't complain about example
* snippets that use them. */
setting 'script.painless.regex.enabled', 'true'
diff --git a/docs/java-rest/high-level/cluster/put_settings.asciidoc b/docs/java-rest/high-level/cluster/put_settings.asciidoc
index 74b479faa0501..dc9b1679d4717 100644
--- a/docs/java-rest/high-level/cluster/put_settings.asciidoc
+++ b/docs/java-rest/high-level/cluster/put_settings.asciidoc
@@ -54,13 +54,6 @@ include-tagged::{doc-tests}/ClusterClientDocumentationIT.java[put-settings-setti
==== Optional Arguments
The following arguments can optionally be provided:
-["source","java",subs="attributes,callouts,macros"]
---------------------------------------------------
-include-tagged::{doc-tests}/ClusterClientDocumentationIT.java[put-settings-request-flat-settings]
---------------------------------------------------
-<1> Whether the updated settings returned in the `ClusterUpdateSettings` should
-be in a flat format
-
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/ClusterClientDocumentationIT.java[put-settings-request-timeout]
diff --git a/docs/java-rest/high-level/indices/indices_exists.asciidoc b/docs/java-rest/high-level/indices/indices_exists.asciidoc
index 4a227db49ed8c..ee744e97ce8bd 100644
--- a/docs/java-rest/high-level/indices/indices_exists.asciidoc
+++ b/docs/java-rest/high-level/indices/indices_exists.asciidoc
@@ -23,8 +23,7 @@ include-tagged::{doc-tests}/IndicesClientDocumentationIT.java[indices-exists-req
<1> Whether to return local information or retrieve the state from master node
<2> Return result in a format suitable for humans
<3> Whether to return all default setting for each of the indices
-<4> Return settings in flat format
-<5> Controls how unavailable indices are resolved and how wildcard expressions are expanded
+<4> Controls how unavailable indices are resolved and how wildcard expressions are expanded
[[java-rest-high-indices-sync]]
==== Synchronous Execution
diff --git a/docs/java-rest/high-level/indices/put_settings.asciidoc b/docs/java-rest/high-level/indices/put_settings.asciidoc
index 49312da82a400..c305eeaa0965b 100644
--- a/docs/java-rest/high-level/indices/put_settings.asciidoc
+++ b/docs/java-rest/high-level/indices/put_settings.asciidoc
@@ -55,13 +55,6 @@ include-tagged::{doc-tests}/IndicesClientDocumentationIT.java[put-settings-setti
==== Optional Arguments
The following arguments can optionally be provided:
-["source","java",subs="attributes,callouts,macros"]
---------------------------------------------------
-include-tagged::{doc-tests}/IndicesClientDocumentationIT.java[put-settings-request-flat-settings]
---------------------------------------------------
-<1> Whether the updated settings returned in the `UpdateSettings` should
-be in a flat format
-
["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests}/IndicesClientDocumentationIT.java[put-settings-request-preserveExisting]
diff --git a/docs/java-rest/high-level/search/field-caps.asciidoc b/docs/java-rest/high-level/search/field-caps.asciidoc
new file mode 100644
index 0000000000000..fef30f629ca61
--- /dev/null
+++ b/docs/java-rest/high-level/search/field-caps.asciidoc
@@ -0,0 +1,108 @@
+[[java-rest-high-field-caps]]
+=== Field Capabilities API
+
+The field capabilities API allows for retrieving the capabilities of fields across multiple indices.
+
+[[java-rest-high-field-caps-request]]
+==== Field Capabilities Request
+
+A `FieldCapabilitiesRequest` contains the list of fields whose capabilities
+should be returned, plus an optional list of target indices. If no indices
+are provided, the request will be executed on all indices.
+
+Note that the fields parameter supports wildcard notation. For example, providing `text_*`
+will cause all fields that match the expression to be returned.
+
+["source","java",subs="attributes,callouts,macros"]
+--------------------------------------------------
+include-tagged::{doc-tests}/SearchDocumentationIT.java[field-caps-request]
+--------------------------------------------------
+
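+For reference, a minimal request for the capabilities of a single field
+across a few indices might look like the following sketch (the field and
+index names here are placeholders):
+
+[source,java]
+----
+// Ask for the capabilities of the "user" field across three indices.
+// fields() accepts wildcards, e.g. "user*"; omitting indices() targets all indices.
+FieldCapabilitiesRequest request = new FieldCapabilitiesRequest()
+    .fields("user")
+    .indices("posts", "authors", "contributors");
+----
+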
+[[java-rest-high-field-caps-request-optional]]
+===== Optional arguments
+
+["source","java",subs="attributes,callouts,macros"]
+--------------------------------------------------
+include-tagged::{doc-tests}/SearchDocumentationIT.java[field-caps-request-indicesOptions]
+--------------------------------------------------
+<1> Setting `IndicesOptions` controls how unavailable indices are resolved and
+how wildcard expressions are expanded.
+
+[[java-rest-high-field-caps-sync]]
+==== Synchronous Execution
+
+The `fieldCaps` method executes the request synchronously:
+
+["source","java",subs="attributes,callouts,macros"]
+--------------------------------------------------
+include-tagged::{doc-tests}/SearchDocumentationIT.java[field-caps-execute]
+--------------------------------------------------
+
+[[java-rest-high-field-caps-async]]
+==== Asynchronous Execution
+
+The `fieldCapsAsync` method executes the request asynchronously,
+calling the provided `ActionListener` when the response is ready:
+
+["source","java",subs="attributes,callouts,macros"]
+--------------------------------------------------
+include-tagged::{doc-tests}/SearchDocumentationIT.java[field-caps-execute-async]
+--------------------------------------------------
+<1> The `FieldCapabilitiesRequest` to execute and the `ActionListener` to use when
+the execution completes.
+
+The asynchronous method does not block and returns immediately. Once the request
+completes, the `ActionListener` is called back using the `onResponse` method
+if the execution successfully completed or using the `onFailure` method if
+it failed.
+
+A typical listener for `FieldCapabilitiesResponse` is constructed as follows:
+
+["source","java",subs="attributes,callouts,macros"]
+--------------------------------------------------
+include-tagged::{doc-tests}/SearchDocumentationIT.java[field-caps-execute-listener]
+--------------------------------------------------
+<1> Called when the execution is successfully completed.
+<2> Called when the whole `FieldCapabilitiesRequest` fails.
+
+[[java-rest-high-field-caps-response]]
+==== FieldCapabilitiesResponse
+
+For each requested field, the returned `FieldCapabilitiesResponse` contains its type
+and whether or not it can be searched or aggregated on. The response also gives
+information about how each index contributes to the field's capabilities.
+
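+The annotated snippet below walks through an example response. As a quick
+sketch, individual capabilities can be pulled out as follows (the accessor
+names are assumptions, so check them against your client version):
+
+[source,java]
+----
+// Capabilities are keyed first by field name, then by type.
+Map<String, FieldCapabilities> userCaps = response.getField("user");
+FieldCapabilities keywordCaps = userCaps.get("keyword");
+boolean searchable = keywordCaps.isSearchable();     // searchable in every index it appears in?
+boolean aggregatable = keywordCaps.isAggregatable();
+----
+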
+["source","java",subs="attributes,callouts,macros"]
+--------------------------------------------------
+include-tagged::{doc-tests}/SearchDocumentationIT.java[field-caps-response]
+--------------------------------------------------
+<1> The `user` field has two possible types, `keyword` and `text`.
+<2> This field only has type `keyword` in the `authors` and `contributors` indices.
+<3> Null, since the field is searchable in all indices for which it has the `keyword` type.
+<4> The `user` field is not aggregatable in the `authors` index.
\ No newline at end of file
diff --git a/docs/java-rest/high-level/search/rank-eval.asciidoc b/docs/java-rest/high-level/search/rank-eval.asciidoc
new file mode 100644
index 0000000000000..6db0dadd00ed7
--- /dev/null
+++ b/docs/java-rest/high-level/search/rank-eval.asciidoc
@@ -0,0 +1,121 @@
+[[java-rest-high-rank-eval]]
+=== Ranking Evaluation API
+
+The `rankEval` method allows you to evaluate the quality of ranked search
+results over a set of search requests. Given sets of manually rated
+documents for each search request, ranking evaluation performs a
+<> request and calculates
+information retrieval metrics like _mean reciprocal rank_, _precision_
+or _discounted cumulative gain_ on the returned results.
+
+[[java-rest-high-rank-eval-request]]
+==== Ranking Evaluation Request
+
+In order to build a `RankEvalRequest`, you first need to create an
+evaluation specification (`RankEvalSpec`). This specification requires
+you to define the evaluation metric that is going to be calculated, as
+well as a list of rated documents per search request. Creating the
+ranking evaluation request then takes the specification and a list of
+target indices as arguments:
+
+["source","java",subs="attributes,callouts,macros"]
+--------------------------------------------------
+include-tagged::{doc-tests}/SearchDocumentationIT.java[rank-eval-request-basic]
+--------------------------------------------------
+<1> Define the metric used in the evaluation
+<2> Add rated documents, specified by index name, id and rating
+<3> Create the search query to evaluate
+<4> Combine the three former parts into a `RatedRequest`
+<5> Create the ranking evaluation specification
+<6> Create the ranking evaluation request
+
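+As a rough sketch of these steps (the class names come from the
+`org.elasticsearch.index.rankeval` package; the exact constructor signatures
+are assumptions, so verify them against your client version):
+
+[source,java]
+----
+// Rate document "123" in index "posts" as relevant (rating 1) for this query.
+List<RatedDocument> ratedDocs = Collections.singletonList(
+    new RatedDocument("posts", "123", 1));
+SearchSourceBuilder query = new SearchSourceBuilder()
+    .query(QueryBuilders.matchQuery("user", "kimchy"));
+RatedRequest ratedRequest = new RatedRequest("kimchy_query", ratedDocs, query);
+
+// Evaluate precision at k over the "posts" index.
+RankEvalSpec spec = new RankEvalSpec(
+    Collections.singletonList(ratedRequest), new PrecisionAtK());
+RankEvalRequest request = new RankEvalRequest(spec, new String[] { "posts" });
+----
+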
+[[java-rest-high-rank-eval-sync]]
+==== Synchronous Execution
+
+The `rankEval` method executes `RankEvalRequest`s synchronously:
+
+["source","java",subs="attributes,callouts,macros"]
+--------------------------------------------------
+include-tagged::{doc-tests}/SearchDocumentationIT.java[rank-eval-execute]
+--------------------------------------------------
+
+[[java-rest-high-rank-eval-async]]
+==== Asynchronous Execution
+
+The `rankEvalAsync` method executes `RankEvalRequest`s asynchronously,
+calling the provided `ActionListener` when the response is ready.
+
+["source","java",subs="attributes,callouts,macros"]
+--------------------------------------------------
+include-tagged::{doc-tests}/SearchDocumentationIT.java[rank-eval-execute-async]
+--------------------------------------------------
+<1> The `RankEvalRequest` to execute and the `ActionListener` to use when
+the execution completes
+
+The asynchronous method does not block and returns immediately. Once the
+request completes, the `ActionListener` is called back using the `onResponse`
+method if the execution completed successfully or using the `onFailure`
+method if it failed.
+
+A typical listener for `RankEvalResponse` looks like:
+
+["source","java",subs="attributes,callouts,macros"]
+--------------------------------------------------
+include-tagged::{doc-tests}/SearchDocumentationIT.java[rank-eval-execute-listener]
+--------------------------------------------------
+<1> Called when the execution is successfully completed.
+<2> Called when the whole `RankEvalRequest` fails.
+
+==== RankEvalResponse
+
+The `RankEvalResponse` that is returned by executing the request
+contains the overall evaluation score, the score of each individual
+search request in the set of queries, and, for each partial result,
+detailed information about the rated search hits and about how the
+metric was calculated.
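+A bare-bones sketch of reading these values (the accessor names are
+assumptions that have shifted between client versions, so verify them
+against the client version you use):
+
+[source,java]
+----
+double overallScore = response.getEvaluationResult(); // combined metric score
+Map<String, EvalQueryQuality> partials = response.getPartialResults();
+for (Map.Entry<String, EvalQueryQuality> entry : partials.entrySet()) {
+    // Per-query score, keyed by the id given in the corresponding RatedRequest.
+    System.out.println(entry.getKey() + ": " + entry.getValue().getQualityLevel());
+}
+----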
+
+["source","java",subs="attributes,callouts,macros"]
+--------------------------------------------------
+include-tagged::{doc-tests}/SearchDocumentationIT.java[rank-eval-response]
+--------------------------------------------------
+<1> The overall evaluation result
+<2> Partial results that are keyed by their query id
+<3> The metric score for each partial result
+<4> Rated search hits contain a fully fledged `SearchHit`
+<5> Rated search hits also contain an `Optional` rating that
+is not present if the document did not get a rating in the request
+<6> Metric details are named after the metric used in the request
+<7> After casting to the metric used in the request, the
+metric details offers insight into parts of the metric calculation
\ No newline at end of file
diff --git a/docs/java-rest/high-level/supported-apis.asciidoc b/docs/java-rest/high-level/supported-apis.asciidoc
index 29052171cddc6..1c0e09c6c079e 100644
--- a/docs/java-rest/high-level/supported-apis.asciidoc
+++ b/docs/java-rest/high-level/supported-apis.asciidoc
@@ -32,10 +32,14 @@ The Java High Level REST Client supports the following Search APIs:
* <>
* <>
* <>
+* <>
+* <>
include::search/search.asciidoc[]
include::search/scroll.asciidoc[]
include::search/multi-search.asciidoc[]
+include::search/field-caps.asciidoc[]
+include::search/rank-eval.asciidoc[]
== Miscellaneous APIs
diff --git a/docs/painless/index.asciidoc b/docs/painless/index.asciidoc
index 4898ed933363b..abfd4d4f00abe 100644
--- a/docs/painless/index.asciidoc
+++ b/docs/painless/index.asciidoc
@@ -5,39 +5,6 @@ include::../Versions.asciidoc[]
include::painless-getting-started.asciidoc[]
-// include::painless-examples.asciidoc[]
-
-// include::painless-design.asciidoc[]
-
include::painless-lang-spec.asciidoc[]
-include::painless-syntax.asciidoc[]
-
include::painless-api-reference.asciidoc[]
-
-////
-Proposed Outline (WIP)
-Getting Started with Painless
- Accessing Doc Values
- Updating Fields
- Working with Dates
- Using Regular Expressions
- Debugging Painless Scripts
-
-Example Scripts
- Using Painless in Script Fields
- Using Painless in Watches
- Using Painless in Function Score Queries
- Using Painless in Script Queries
- Using Painless When Updating Docs
- Using Painless When Reindexing
-
-How Painless Works
- Painless Architecture
- Dispatching Functions
-
-Painless Language Specification
-Painless API
-////
-
-Painless API Reference
diff --git a/docs/painless/painless-api-reference.asciidoc b/docs/painless/painless-api-reference.asciidoc
index 1bda6d890c859..54b1f20977b61 100644
--- a/docs/painless/painless-api-reference.asciidoc
+++ b/docs/painless/painless-api-reference.asciidoc
@@ -1,17 +1,13 @@
-["appendix",id="painless-api-reference"]
-= Painless API Reference
+[[painless-api-reference]]
+== Painless API Reference
-Painless has a strict whitelist for methods and
-classes to make sure that all painless scripts are secure and fast. Most of
-these methods are exposed directly from the JRE while others are part of
-Elasticsearch or Painless itself. Below is a list of all available methods
-grouped under the classes on which you can call them. Clicking on the method
-name takes you to the documentation for the method.
-
-NOTE: Methods defined in the JRE also have a `(java 9)` link which can be used
-to see the method's documentation in Java 9 while clicking on the method's name
-goes to the Java 8 documentation. Usually these aren't different but it is
-worth going to the version that matches the version of Java you are using to
-run Elasticsearch just in case.
+Painless has a strict whitelist for methods and classes to ensure all
+Painless scripts are secure. Most of these methods are exposed directly
+from the Java Runtime Environment (JRE) while others are part of
+Elasticsearch or Painless itself. Below is a list of all available
+classes grouped with their respective methods. Clicking on the method
+name takes you to the documentation for that specific method. Methods
+defined in the JRE also have a `(java 9)` link which can be used to see
+the method's documentation in Java 9.
include::painless-api-reference/index.asciidoc[]
diff --git a/docs/painless/painless-casting.asciidoc b/docs/painless/painless-casting.asciidoc
new file mode 100644
index 0000000000000..ec4f9919bd043
--- /dev/null
+++ b/docs/painless/painless-casting.asciidoc
@@ -0,0 +1,172 @@
+[[painless-casting]]
+=== Casting
+
+Casting is the conversion of one type to another. Implicit casts are casts that
+occur automatically, such as during an assignment operation. Explicit casts are
+casts where you use the casting operator to explicitly convert one type to
+another. This is necessary during operations where the cast cannot be inferred.
+
+To cast to a new type, precede the expression with the new type enclosed in
+parentheses, for example
+`(int)x`.
+
+The following sections specify the implicit casts that can be performed and the
+explicit casts that are allowed. The only other permitted cast is casting
+a single character `String` to a `char`.
+
+*Grammar:*
+[source,ANTLR4]
+----
+cast: '(' TYPE ')' expression
+----
+
+[[numeric-casting]]
+==== Numeric Casting
+
+The following table shows the allowed implicit and explicit casts between
+numeric types. Read the table by row. To find out if you need to explicitly
+cast from type A to type B, find the row for type A and scan across to the
+column for type B.
+
+IMPORTANT: Explicit casts between numeric types can result in some data loss. A
+smaller numeric type cannot necessarily accommodate the value from a larger
+numeric type. You might also lose precision when casting from integer types
+to floating point types.
+
+|====
+| | byte | short | char | int | long | float | double
+| byte | | implicit | implicit | implicit | implicit | implicit | implicit
+| short | explicit | | explicit | implicit | implicit | implicit | implicit
+| char | explicit | explicit | | implicit | implicit | implicit | implicit
+| int | explicit | explicit | explicit | | implicit | implicit | implicit
+| long | explicit | explicit | explicit | explicit | | implicit | implicit
+| float | explicit | explicit | explicit | explicit | explicit | | implicit
+| double | explicit | explicit | explicit | explicit | explicit | explicit |
+|====
+
+
+*Examples:*
+[source,Java]
+----
+int a = 1; // Declare int variable a and set it to the literal
+ // value 1
+long b = a; // Declare long variable b and set it to int variable
+ // a with an implicit cast to convert from int to long
+short c = (short)b; // Declare short variable c, explicitly cast b to a
+ // short, and assign b to c
+byte d = a; // ERROR: Casting an int to a byte requires an explicit
+ // cast
+double e = (double)a; // Explicitly cast int variable a to a double and assign
+ // it to the double variable e. The explicit cast is
+ // allowed, but it is not necessary.
+----
+
+[[reference-casting]]
+==== Reference Casting
+
+A reference type can be implicitly cast to another reference type as long as
+the type being cast _from_ is a descendant of the type being cast _to_. A
+reference type can be explicitly cast _to_ if the type being cast to is a
+descendant of the type being cast _from_.
+
+*Examples:*
+[source,Java]
+----
+List x; // Declare List variable x
+ArrayList y = new ArrayList(); // Declare ArrayList variable y and assign it a
+ // newly allocated ArrayList [1]
+x = y; // Assign ArrayList y to List x using an
+ // implicit cast
+y = (ArrayList)x; // Explicitly cast List x to an ArrayList and
+ // assign it to ArrayList y
+x = (List)y; // Set List x to ArrayList y using an explicit
+ // cast (the explicit cast is not necessary)
+y = x; // ERROR: List x cannot be implicitly cast to
+ // an ArrayList, an explicit cast is required
+Map m = y; // ERROR: Cannot implicitly or explicitly cast [2]
+ // an ArrayList to a Map, no relationship
+ // exists between the two types.
+----
+[1] `ArrayList` is a descendant of the `List` type.
+[2] `Map` is unrelated to the `List` and `ArrayList` types.
+
+[[def-type-casting]]
+==== def Type Casting
+All primitive and reference types can always be implicitly cast to
+`def`. While it is possible to explicitly cast to `def`, it is not necessary.
+
+However, it is not always possible to implicitly cast a `def` to other
+primitive and reference types. An explicit cast is required if an explicit
+cast would normally be required between the non-def types.
+
+
+*Examples:*
+[source,Java]
+----
+def x; // Declare def variable x and set it to null
+x = 3.0; // Set the def variable x to the literal 3.0 with an implicit
+ // cast from double to def
+double a = x; // Declare double variable a and set it to def variable x,
+ // which contains a double
+int b = x; // ERROR: Results in a run-time error because an explicit cast is
+ // required to cast from a double to an int
+int c = (int)x; // Declare int variable c, explicitly cast def variable x to an
+ // int, and assign x to c
+----
+
+[[boxing-unboxing]]
+==== Boxing and Unboxing
+
+Boxing is where a cast is used to convert a primitive type to its corresponding
+reference type. Unboxing is the reverse, converting a reference type to the
+corresponding primitive type.
+
+There are two places Painless performs implicit boxing and unboxing:
+
+* When you call methods, Painless automatically boxes and unboxes arguments
+so you can specify either primitive types or their corresponding reference
+types.
+* When you use the `def` type, Painless automatically boxes and unboxes as
+needed when converting to and from `def`.
+
+The casting operator does not support any way to explicitly box a primitive
+type or unbox a reference type.
+
+If a primitive type needs to be converted to a reference type, the Painless
+reference type API supports methods that can do that. However, under normal
+circumstances this should not be necessary.
+
+*Examples:*
+[source,Java]
+----
+Integer x = 1; // ERROR: not a legal implicit cast
+Integer y = (Integer)1; // ERROR: not a legal explicit cast
+int a = new Integer(1); // ERROR: not a legal implicit cast
+int b = (int)new Integer(1); // ERROR: not a legal explicit cast
+----
+
+[[promotion]]
+==== Promotion
+
+Promotion is where certain operations require types to be either a minimum
+numerical type or for two (or more) types to be equivalent.
+The documentation for each operation that has these requirements
+includes promotion tables that describe how this is handled.
+
+When an operation promotes a type or types, the resultant type
+of the operation is the promoted type. Types can be promoted to `def`
+at compile-time; however, at run-time, the resultant type will be the
+promotion of the types the `def` is representing.
+
+*Examples:*
+[source,Java]
+----
+2 + 2.0 // Add the literal int 2 and the literal double 2.0. The literal
+ // 2 is promoted to a double and the resulting value is a double.
+
+def x = 1; // Declare def variable x and set it to the literal int 1 through
+ // an implicit cast
+x + 2.0F // Add def variable x and the literal float 2.0.
+ // At compile-time the types are promoted to def.
+ // At run-time the types are promoted to float.
+----
diff --git a/docs/painless/painless-comments.asciidoc b/docs/painless/painless-comments.asciidoc
new file mode 100644
index 0000000000000..588e464d97f78
--- /dev/null
+++ b/docs/painless/painless-comments.asciidoc
@@ -0,0 +1,51 @@
+[[painless-comments]]
+=== Comments
+
+Use the `//` token anywhere on a line to specify a single-line comment. All
+characters from the `//` token to the end of the line are ignored. Use an
+opening `/*` token and a closing `*/` token to specify a multi-line comment.
+Multi-line comments can start anywhere on a line, and all characters in between
+the `/*` token and `*/` token are ignored. Comments can be included anywhere
+within a script.
+
+*Grammar*
+[source,ANTLR4]
+----
+SINGLE_LINE_COMMENT: '//' .*? [\n\r];
+MULTI_LINE_COMMENT: '/*' .*? '*/';
+----
+
+*Examples*
+
+* Single-line comments.
++
+[source,Painless]
+----
+// single-line comment
+
+int value; // single-line comment
+----
++
+* Multi-line comments.
++
+[source,Painless]
+----
+/* multi-
+ line
+ comment */
+
+int value; /* multi-
+ line
+ comment */ value = 0;
+
+int value; /* multi-line
+ comment */
+
+/* multi-line
+ comment */ int value;
+
+int value; /* multi-line
+ comment */ value = 0;
+
+int value; /* multi-line comment */ value = 0;
+----
diff --git a/docs/painless/painless-description.asciidoc b/docs/painless/painless-description.asciidoc
index 874eab5632cfb..dfaf66ca26d4b 100644
--- a/docs/painless/painless-description.asciidoc
+++ b/docs/painless/painless-description.asciidoc
@@ -2,7 +2,7 @@ _Painless_ is a simple, secure scripting language designed specifically for use
with Elasticsearch. It is the default scripting language for Elasticsearch and
can safely be used for inline and stored scripts. For a detailed description of
the Painless syntax and language features, see the
-{painless}/painless-specification.html[Painless Language Specification].
+{painless}/painless-lang-spec.html[Painless Language Specification].
[[painless-features]]
You can use Painless anywhere scripts can be used in Elasticsearch. Painless
diff --git a/docs/painless/painless-execute-script.asciidoc b/docs/painless/painless-execute-script.asciidoc
new file mode 100644
index 0000000000000..7997c87e3e45f
--- /dev/null
+++ b/docs/painless/painless-execute-script.asciidoc
@@ -0,0 +1,53 @@
+[[painless-execute-api]]
+=== Painless execute API
+
+The Painless execute API allows an arbitrary script to be executed and a result to be returned.
+
+[[painless-execute-api-parameters]]
+.Parameters
+[options="header"]
+|======
+| Name | Required | Default | Description
+| `script` | yes | - | The script to execute
+| `context` | no | `execute_api_script` | The context the script should be executed in.
+|======
+
+==== Contexts
+
+Contexts control how scripts are executed, what variables are available at runtime and what the return type is.
+
+===== Painless test script context
+
+The `painless_test` context executes scripts as-is and does not add any special parameters.
+The only variable that is available is `params`, which can be used to access user-defined values.
+The result of the script is always converted to a string.
+If no context is specified, this context is used by default.
+
+==== Example
+
+Request:
+
+[source,js]
+----------------------------------------------------------------
+POST /_scripts/painless/_execute
+{
+ "script": {
+ "source": "params.count / params.total",
+ "params": {
+ "count": 100.0,
+ "total": 1000.0
+ }
+ }
+}
+----------------------------------------------------------------
+// CONSOLE
+
+Response:
+
+[source,js]
+--------------------------------------------------
+{
+ "result": "0.1"
+}
+--------------------------------------------------
+// TESTRESPONSE
\ No newline at end of file
diff --git a/docs/painless/painless-syntax.asciidoc b/docs/painless/painless-general-syntax.asciidoc
similarity index 72%
rename from docs/painless/painless-syntax.asciidoc
rename to docs/painless/painless-general-syntax.asciidoc
index c68ed5168c01b..114bff80bfa70 100644
--- a/docs/painless/painless-syntax.asciidoc
+++ b/docs/painless/painless-general-syntax.asciidoc
@@ -1,7 +1,6 @@
-[[painless-syntax]]
-=== Painless Syntax
+[[painless-general-syntax]]
+=== General Syntax
-[float]
[[control-flow]]
==== Control flow
@@ -17,7 +16,6 @@ for (item : list) {
}
---------------------------------------------------------
-[float]
[[functions]]
==== Functions
@@ -32,7 +30,6 @@ if (isNegative(someVar)) {
}
---------------------------------------------------------
-[float]
[[lambda-expressions]]
==== Lambda expressions
Lambda expressions and method references work the same as in https://docs.oracle.com/javase/tutorial/java/javaOO/lambdaexpressions.html[Java].
@@ -49,7 +46,6 @@ list.sort(Integer::compare);
You can make method references to functions within the script with `this`,
for example `list.sort(this::mycompare)`.
-[float]
[[patterns]]
==== Patterns
@@ -62,7 +58,6 @@ are always constants and compiled efficiently a single time.
Pattern p = /[aeiou]/
---------------------------------------------------------
-[float]
[[pattern-flags]]
===== Pattern flags
@@ -84,34 +79,3 @@ Pattern class] using these characters:
|`u` | UNICODE_CASE | `'Ɛ' ==~ /ɛ/iu`
|`x` | COMMENTS (aka extended) | `'a' ==~ /a #comment/x`
|=======================================================================
-
-[float]
-[[painless-deref]]
-==== Dereferences
-
-Like lots of languages, Painless uses `.` to reference fields and call methods:
-
-[source,painless]
----------------------------------------------------------
-String foo = 'foo';
-TypeWithGetterOrPublicField bar = new TypeWithGetterOrPublicField()
-return foo.length() + bar.x
----------------------------------------------------------
-
-Like Groovy, Painless uses `?.` to perform null-safe references, with the
-result being `null` if the left hand side is `null`:
-
-[source,painless]
----------------------------------------------------------
-String foo = null;
-return foo?.length() // Returns null
----------------------------------------------------------
-
-Unlike Groovy, Painless doesn't support writing to `null` values with this
-operator:
-
-[source,painless]
----------------------------------------------------------
-TypeWithSetterOrPublicField foo = null;
-foo?.x = 'bar' // Compile error
----------------------------------------------------------
diff --git a/docs/painless/painless-getting-started.asciidoc b/docs/painless/painless-getting-started.asciidoc
index 8cf163d55d7b9..2cf91666ba48d 100644
--- a/docs/painless/painless-getting-started.asciidoc
+++ b/docs/painless/painless-getting-started.asciidoc
@@ -389,3 +389,5 @@ dispatch *feels* like it'd add a ton of complexity which'd make maintenance and
other improvements much more difficult.
include::painless-debugging.asciidoc[]
+
+include::painless-execute-script.asciidoc[]
diff --git a/docs/painless/painless-identifiers.asciidoc b/docs/painless/painless-identifiers.asciidoc
new file mode 100644
index 0000000000000..17073e3d4c415
--- /dev/null
+++ b/docs/painless/painless-identifiers.asciidoc
@@ -0,0 +1,29 @@
+[[painless-identifiers]]
+=== Identifiers
+
+Specify identifiers to <>, <>, and
+<> variables, <>, and
+<>. <> and
+<> cannot be used as identifiers.
+
+*Grammar*
+[source,ANTLR4]
+----
+ID: [_a-zA-Z] [_a-zA-Z-0-9]*;
+----
+
+*Examples*
+
+* Variations of identifiers.
++
+[source,Painless]
+----
+a
+Z
+id
+list
+list0
+MAP25
+_map25
+Map_25
+----
diff --git a/docs/painless/painless-keywords.asciidoc b/docs/painless/painless-keywords.asciidoc
new file mode 100644
index 0000000000000..cb3bafbd20f13
--- /dev/null
+++ b/docs/painless/painless-keywords.asciidoc
@@ -0,0 +1,13 @@
+[[painless-keywords]]
+=== Keywords
+
+The keywords in the table below are reserved for built-in language
+features. These keywords cannot be used as
+<> or <>.
+
+[cols="^1,^1,^1,^1,^1"]
+|====
+| if | else | while | do | for
+| in | continue | break | return | new
+| try | catch | throw | this | instanceof
+|====
diff --git a/docs/painless/painless-lang-spec.asciidoc b/docs/painless/painless-lang-spec.asciidoc
index 6544b0ad26495..ba6595000ae2f 100644
--- a/docs/painless/painless-lang-spec.asciidoc
+++ b/docs/painless/painless-lang-spec.asciidoc
@@ -1,73 +1,36 @@
-[[painless-specification]]
+[[painless-lang-spec]]
== Painless Language Specification
-Painless uses a Java-style syntax that is similar to Groovy. In fact, most
-Painless scripts are also valid Groovy, and simple Groovy scripts are typically
-valid Painless. This specification assumes you have at least a passing
-familiarity with Java and related languages.
-
-Painless is essentially a subset of Java with some additional scripting
-language features that make scripts easier to write. However, there are some
-important differences, particularly with the casting model. For more detailed
+Painless is a scripting language designed for security and performance.
+Painless syntax is similar to Java syntax along with some additional
+features such as dynamic typing, Map and List accessor shortcuts, and array
+initializers. As a direct comparison to Java, there are some important
+differences, especially related to the casting model. For more detailed
conceptual information about the basic constructs that Java and Painless share,
refer to the corresponding topics in the
https://docs.oracle.com/javase/specs/jls/se8/html/index.html[Java Language
Specification].
Painless scripts are parsed and compiled using the http://www.antlr.org/[ANTLR4]
-and http://asm.ow2.org/[ASM] libraries. Painless scripts are compiled directly
-into Java byte code and executed against a standard Java Virtual Machine. This
-specification uses ANTLR4 grammar notation to describe the allowed syntax.
+and http://asm.ow2.org/[ASM] libraries. Scripts are compiled directly
+into Java Virtual Machine (JVM) byte code and executed against a standard JVM.
+This specification uses ANTLR4 grammar notation to describe the allowed syntax.
However, the actual Painless grammar is more compact than what is shown here.
-[float]
-[[comments]]
-==== Comments
-
-Painless supports both single-line and multi-line comments. You can include
-comments anywhere within a script.
-
-Single-line comments are preceded by two slashes: `// comment`. They can be
-placed anywhere on a line. All characters from the two slashes to the end of
-the line are ignored.
-
-Multi-line comments are preceded by a slash-star `/*` and closed by
-star-slash `*/`. Multi-line comments can start anywhere on a line. All
-characters from the opening `/*` to the closing `*/` are ignored.
-
-*Examples:*
-
-[source,Java]
-----
-// single-line comment
-
- // single-line comment
+include::painless-comments.asciidoc[]
-/* multi-
- line
- comment */
+include::painless-keywords.asciidoc[]
- /* multi-line
- comment */
+include::painless-literals.asciidoc[]
- /* multi-line comment */
-----
+include::painless-identifiers.asciidoc[]
-[float]
-[[keywords]]
-==== Keywords
+include::painless-variables.asciidoc[]
-Painless reserves the following keywords for built-in language features.
-These keywords cannot be used in other contexts, such as identifiers.
+include::painless-types.asciidoc[]
-[cols="^1,^1,^1,^1,^1"]
-|====
-| if | else | while | do | for
-| in | continue | break | return | new
-| try | catch | throw | this | instanceof
-|====
+include::painless-casting.asciidoc[]
-include::painless-literals.asciidoc[]
-include::painless-variables.asciidoc[]
-include::painless-types.asciidoc[]
include::painless-operators.asciidoc[]
+
+include::painless-general-syntax.asciidoc[]
diff --git a/docs/painless/painless-literals.asciidoc b/docs/painless/painless-literals.asciidoc
index 43c5eb82f96a2..441cb264f1e15 100644
--- a/docs/painless/painless-literals.asciidoc
+++ b/docs/painless/painless-literals.asciidoc
@@ -1,94 +1,142 @@
-[[literals]]
+[[painless-literals]]
=== Literals
-Literals are values that you can specify directly in Painless scripts.
+Use literals to specify different types of values directly in a script.
[[integers]]
==== Integers
-Specify integer literals in decimal, octal, or hex notation. Use the following
-single letter designations to specify the primitive type: `l` for `long`, `f`
-for `float`, and `d` for `double`. If not specified, the type defaults to
-`int` (with the exception of certain assignments described later).
+Use integer literals to specify an integer value in decimal, octal, or hex
+notation of the <> `int`, `long`, `float`,
+or `double`. Use the following single letter designations to specify the
+<>: `l` or `L` for `long`, `f` or `F` for
+`float`, and `d` or `D` for `double`. If not specified, the type defaults to
+`int`. Use `0` as a prefix to specify an integer literal as octal, and use
+`0x` or `0X` as a prefix to specify an integer literal as hex.
-*Grammar:*
+*Grammar*
[source,ANTLR4]
----
INTEGER: '-'? ( '0' | [1-9] [0-9]* ) [lLfFdD]?;
-OCTAL: '-'? '0' [0-7]+ [lL]?;
-HEX: '-'? '0' [xX] [0-9a-fA-F]+ [lL]?;
+OCTAL: '-'? '0' [0-7]+ [lL]?;
+HEX: '-'? '0' [xX] [0-9a-fA-F]+ [lL]?;
----
-*Examples:*
-[source,Java]
+*Examples*
+
+* Integer literals.
++
+[source,Painless]
----
-0 // integer literal of 0
-0D // double literal of 0.0
-1234L // long literal of 1234
--90F // float literal of -90.0
--022 // integer literal of -18 specified in octal
-0xF2A // integer literal of 3882
+<1> 0
+<2> 0D
+<3> 1234L
+<4> -90f
+<5> -022
+<6> 0xF2A
----
-
-[[floating-point-values]]
-==== Floating Point Values
-
-Specify floating point literals using the following single letter designations
-for the primitive type: `f` for `float` and `d` for `double`.
-If not specified, the type defaults to `double`.
-
-*Grammar:*
++
+<1> `int 0`
+<2> `double 0.0`
+<3> `long 1234`
+<4> `float -90.0`
+<5> `int -18` in octal
+<6> `int 3882` in hex
+
+[[floats]]
+==== Floats
+
+Use floating point literals to specify a floating point value of the
+<> `float` or `double`. Use the following
+single letter designations to specify the <>:
+`f` or `F` for `float` and `d` or `D` for `double`. If not specified, the type defaults
+to `double`.
+
+*Grammar*
[source,ANTLR4]
----
-DECIMAL: '-'? ( '0' | [1-9] [0-9]* ) (DOT [0-9]+)? ( [eE] [+\-]? [0-9]+ )? [fFdD]?;
+DECIMAL: '-'? ( '0' | [1-9] [0-9]* ) (DOT [0-9]+)? EXPONENT? [fFdD]?;
+EXPONENT: ( [eE] [+\-]? [0-9]+ );
----
-*Examples:*
-[source,Java]
+*Examples*
+
+* Floating point literals.
++
+[source,Painless]
----
-0.0 // double value of 0.0
-1E6 // double value of 1000000
-0.977777 // double value of 0.97777
--126.34 // double value of -126.34
-89.9F // float value of 89.9
+<1> 0.0
+<2> 1E6
+<3> 0.977777
+<4> -126.34
+<5> 89.9F
----
++
+<1> `double 0.0`
+<2> `double 1000000.0` in exponent notation
+<3> `double 0.977777`
+<4> `double -126.34`
+<5> `float 89.9`
[[strings]]
==== Strings
-Specify literal string with either single or double quotes. In double-quoted
-literal strings, you can escape double-quotes with a backslash to include them
-in the string. Similarly, you escape single quotes with a backslash in
-single-quoted literal strings. Backslashes themselves also need to be
-escaped with a backslash.
+Use string literals to specify <> values with
+either single-quotes or double-quotes. Use a `\"` token to include a
+double-quote as part of a double-quoted string literal. Use a `\'` token to
+include a single-quote as part of a single-quoted string literal. Use a `\\`
+token to include a backslash as part of any string literal.
-*Grammar:*
+*Grammar*
[source,ANTLR4]
----
-STRING: ( '"' ( '\\"' | '\\\\' | ~[\\"] )*? '"' ) | ( '\'' ( '\\\'' | '\\\\' | ~[\\'] )*? '\'' );
+STRING: ( '"' ( '\\"' | '\\\\' | ~[\\"] )*? '"' )
+ | ( '\'' ( '\\\'' | '\\\\' | ~[\\'] )*? '\'' );
----
-*Examples:*
-[source,Java]
+*Examples*
+
+* String literals using single-quotes.
++
+[source,Painless]
----
-"double-quoted String literal"
-'single-quoted String literal'
-"\"double-quoted String with escaped double-quotes\" and backslash: \\"
-'\'single-quoted String with escaped single-quotes\' and backslash \\'
-"double-quoted String with non-escaped 'single-quotes'"
-'single-quoted String with non-escaped "double-quotes"'
+'single-quoted string literal'
+'\'single-quoted with escaped single-quotes\' and backslash \\'
+'single-quoted with non-escaped "double-quotes"'
----
++
+* String literals using double-quotes.
++
+[source,Painless]
+----
+"double-quoted string literal"
+"\"double-quoted with escaped double-quotes\" and backslash: \\"
+"double-quoted with non-escaped 'single-quotes'"
+----
+
+[[characters]]
+==== Characters
-[[char]]
-===== Char
+Use the <> to convert string literals or
+<> values into <> values.
+<> values converted into
+<> values must be exactly one character in length
+or an error will occur.
-You cannot directly specify character literals in Painless. However, you can
-cast single-character strings to char. Attempting to cast a multi-character
-string to a char throws an error.
+*Examples*
-*Examples:*
-[source,Java]
+* Casting string literals into <> values.
++
+[source,Painless]
----
(char)"C"
(char)'c'
-----
\ No newline at end of file
+----
++
+* Casting a <> value into a <> value.
++
+[source,Painless]
+----
+String s = "s";
+char c = (char)s;
+----
diff --git a/docs/painless/painless-operators.asciidoc b/docs/painless/painless-operators.asciidoc
index 0d5135022ad90..915d811fa441b 100644
--- a/docs/painless/painless-operators.asciidoc
+++ b/docs/painless/painless-operators.asciidoc
@@ -1,3 +1,4 @@
+[[painless-operators]]
=== Operators
The following is a table of the available operators in Painless. Each operator will have further information and examples outside of the table. Many operators will have a promotion table as described by the documentation on promotion [MARK].
@@ -703,6 +704,7 @@ e = ~d; // sets e the negation of d
The cast operator can be used to explicitly convert one type to another. See casting [MARK] for more information.
+[[constructor-call]]
==== Constructor Call
A constructor call is a special type of method call [MARK] used to allocate a reference type instance using the new operator. The format is the new operator followed by a type, an opening parenthesis, arguments if any, and a closing parenthesis. Arguments are a series of zero-to-many expressions delimited by commas. Auto-boxing and auto-unboxing will be applied automatically for arguments passed into a constructor call. See boxing and unboxing [MARK] for more information on this topic. Constructor argument types can always be resolved at run-time; if appropriate type conversions (casting) cannot be applied an error will occur. Once a reference type instance has been allocated, its members may be used as part of other expressions.
diff --git a/docs/painless/painless-types.asciidoc b/docs/painless/painless-types.asciidoc
index 9e5077503b4a8..9d575a2069ae3 100644
--- a/docs/painless/painless-types.asciidoc
+++ b/docs/painless/painless-types.asciidoc
@@ -1,5 +1,5 @@
-[[types]]
-=== Data Types
+[[painless-types]]
+=== Types
Painless supports both dynamic and static types. Static types are split into
_primitive types_ and _reference types_.
@@ -267,176 +267,3 @@ def[] da = new def[] {i, l, f*d, s}; // Declare def array da and set it to
// a def array with a size of 4 and the
// values i, l, f*d, and s
----
-
-[[casting]]
-=== Casting
-
-Casting is the conversion of one type to another. Implicit casts are casts that
-occur automatically, such as during an assignment operation. Explicit casts are
-casts where you use the casting operator to explicitly convert one type to
-another. This is necessary during operations where the cast cannot be inferred.
-
-To cast to a new type, precede the expression by the new type enclosed in
-parentheses, for example
-`(int)x`.
-
-The following sections specify the implicit casts that can be performed and the
-explicit casts that are allowed. The only other permitted cast is casting
-a single character `String` to a `char`.
-
-*Grammar:*
-[source,ANTLR4]
-----
-cast: '(' TYPE ')' expression
-----
-
-[[numeric-casting]]
-==== Numeric Casting
-
-The following table shows the allowed implicit and explicit casts between
-numeric types. Read the table by row. To find out if you need to explicitly
-cast from type A to type B, find the row for type A and scan across to the
-column for type B.
-
-IMPORTANT: Explicit casts between numeric types can result in some data loss. A
-smaller numeric type cannot necessarily accommodate the value from a larger
-numeric type. You might also lose precision when casting from integer types
-to floating point types.
-
-|====
-| | byte | short | char | int | long | float | double
-| byte | | implicit | implicit | implicit | implicit | implicit | implicit
-| short | explicit | | explicit | implicit | implicit | implicit | implicit
-| char | explicit | explicit | | implicit | implicit | implicit | implicit
-| int | explicit | explicit | explicit | | implicit | implicit | implicit
-| long | explicit | explicit | explicit | explicit | | implicit | implicit
-| float | explicit | explicit | explicit | explicit | explicit | | implicit
-| double | explicit | explicit | explicit | explicit | explicit | explicit |
-|====
-
-
-Example(s)
-[source,Java]
-----
-int a = 1; // Declare int variable a and set it to the literal
- // value 1
-long b = a; // Declare long variable b and set it to int variable
- // a with an implicit cast to convert from int to long
-short c = (short)b; // Declare short variable c, explicitly cast b to a
- // short, and assign b to c
-byte d = a; // ERROR: Casting an int to a byte requires an explicit
- // cast
-double e = (double)a; // Explicitly cast int variable a to a double and assign
- // it to the double variable e. The explicit cast is
- // allowed, but it is not necessary.
-----
-
-[[reference-casting]]
-==== Reference Casting
-
-A reference type can be implicitly cast to another reference type as long as
-the type being cast _from_ is a descendant of the type being cast _to_. A
-reference type can be explicitly cast _to_ if the type being cast to is a
-descendant of the type being cast _from_.
-
-*Examples:*
-[source,Java]
-----
-List x; // Declare List variable x
-ArrayList y = new ArrayList(); // Declare ArrayList variable y and assign it a
- // newly allocated ArrayList [1]
-x = y; // Assign Arraylist y to List x using an
- // implicit cast
-y = (ArrayList)x; // Explicitly cast List x to an ArrayList and
- // assign it to ArrayList y
-x = (List)y; // Set List x to ArrayList y using an explicit
- // cast (the explicit cast is not necessary)
-y = x; // ERROR: List x cannot be implicitly cast to
- // an ArrayList, an explicit cast is required
-Map m = y; // ERROR: Cannot implicitly or explicitly cast [2]
- // an ArrayList to a Map, no relationship
- // exists between the two types.
-----
-[1] `ArrayList` is a descendant of the `List` type.
-[2] `Map` is unrelated to the `List` and `ArrayList` types.
-
-[[def-type-casting]]
-==== def Type Casting
-All primitive and reference types can always be implicitly cast to
-`def`. While it is possible to explicitly cast to `def`, it is not necessary.
-
-However, it is not always possible to implicitly cast a `def` to other
-primitive and reference types. An explicit cast is required if an explicit
-cast would normally be required between the non-def types.
-
-
-*Examples:*
-[source,Java]
-----
-def x; // Declare def variable x and set it to null
-x = 3; // Set the def variable x to the literal 3 with an implicit
- // cast from int to def
-double a = x; // Declare double variable a and set it to def variable x,
- // which contains a double
-int b = x; // ERROR: Results in a run-time error because an explicit cast is
- // required to cast from a double to an int
-int c = (int)x; // Declare int variable c, explicitly cast def variable x to an
- // int, and assign x to c
-----
-
-[[boxing-unboxing]]
-==== Boxing and Unboxing
-
-Boxing is where a cast is used to convert a primitive type to its corresponding
-reference type. Unboxing is the reverse, converting a reference type to the
-corresponding primitive type.
-
-There are two places Painless performs implicit boxing and unboxing:
-
-* When you call methods, Painless automatically boxes and unboxes arguments
-so you can specify either primitive types or their corresponding reference
-types.
-* When you use the `def` type, Painless automatically boxes and unboxes as
-needed when converting to and from `def`.
-
-The casting operator does not support any way to explicitly box a primitive
-type or unbox a reference type.
-
-If a primitive type needs to be converted to a reference type, the Painless
-reference type API supports methods that can do that. However, under normal
-circumstances this should not be necessary.
-
-*Examples:*
-[source,Java]
-----
-Integer x = 1; // ERROR: not a legal implicit cast
-Integer y = (Integer)1; // ERROR: not a legal explicit cast
-int a = new Integer(1); // ERROR: not a legal implicit cast
-int b = (int)new Integer(1); // ERROR: not a legal explicit cast
-----
-
-[[promotion]]
-==== Promotion
-
-Promotion is where certain operations require types to be either a minimum
-numerical type or for two (or more) types to be equivalent.
-The documentation for each operation that has these requirements
-includes promotion tables that describe how this is handled.
-
-When an operation promotes a type or types, the resultant type
-of the operation is the promoted type. Types can be promoted to def
-at compile-time; however, at run-time, the resultant type will be the
-promotion of the types the `def` is representing.
-
-*Examples:*
-[source,Java]
-----
-2 + 2.0 // Add the literal int 2 and the literal double 2.0. The literal
- // 2 is promoted to a double and the resulting value is a double.
-
-def x = 1; // Declare def variable x and set it to the literal int 1 through
- // an implicit cast
-x + 2.0F // Add def variable x and the literal float 2.0.
- // At compile-time the types are promoted to def.
- // At run-time the types are promoted to float.
-----
diff --git a/docs/painless/painless-variables.asciidoc b/docs/painless/painless-variables.asciidoc
index 2177b0bb91ba8..9756676a08b5b 100644
--- a/docs/painless/painless-variables.asciidoc
+++ b/docs/painless/painless-variables.asciidoc
@@ -1,123 +1,130 @@
-[[variables]]
+[[painless-variables]]
=== Variables
-Variables in Painless must be declared and can be statically or <>.
-
-[[variable-identifiers]]
-==== Variable Identifiers
-
-Specify variable identifiers using the following grammar. Variable identifiers
-must start with a letter or underscore. You cannot use <> or
-<> as identifiers.
-
-*Grammar:*
-[source,ANTLR4]
-----
-ID: [_a-zA-Z] [_a-zA-Z-0-9]*;
-----
-
-*Examples:*
-[source,Java]
-----
-_
-a
-Z
-id
-list
-list0
-MAP25
-_map25
-----
-
-[[variable-declaration]]
-==== Variable Declaration
-
-Variables must be declared before you use them. The format is `type-name
-identifier-name`. To declare multiple variables of the same type, specify a
-comma-separated list of identifier names. You can immediately assign a value to
-a variable when you declare it.
-
-*Grammar:*
+<> variables to <> values for
+<> in expressions. Specify variables as a
+<>, <>, or
+<>. Variable operations follow the structure of a
+standard JVM in relation to instruction execution and memory usage.
+
+[[declaration]]
+==== Declaration
+
+Declare variables before use with the format of <>
+<>. Specify a comma-separated list of
+<> following the <>
+to declare multiple variables in a single statement. Use an
+<> statement combined with a declaration statement to
+immediately assign a value to a variable. Variables not immediately assigned a
+value will have a default value assigned implicitly based on the
+<>.
+
+*Grammar*
[source,ANTLR4]
----
+declaration : type ID assignment? (',' ID assignment?)*;
type: ID ('[' ']')*;
-declaration : type ID (',' ID)*;
+assignment: '=' expression;
----
-*Examples:*
-[source,Java]
+*Examples*
+
+* Different variations of variable declaration.
++
+[source,Painless]
----
-int x; // Declare a variable with type int and id x
-List y; // Declare a variable with type List and id y
-int x, y, z; // Declare variables with type int and ids x, y, and z
-def[] d; // Declare the variable d with type def[]
-int i = 10; // Declare the int variable i and set it to the int literal 10
+<1> int x;
+<2> List y;
+<3> int x, y, z;
+<4> def[] d;
+<5> int i = 10;
----
++
+<1> declare a variable of type `int` and identifier `x`
+<2> declare a variable of type `List` and identifier `y`
+<3> declare three variables of type `int` and identifiers `x`, `y`, `z`
+<4> declare a variable of type `def[]` and identifier `d`
+<5> declare a variable of type `int` and identifier `i`;
+ assign the integer literal `10` to `i`
-[[variable-assignment]]
-==== Variable Assignment
+[[assignment]]
+==== Assignment
-Use the equals operator (`=`) to assign a value to a variable. The format is
-`identifier-name = value`. Any value expression can be assigned to any variable
-as long as the types match or the expression's type can be implicitly cast to
-the variable's type. An error occurs if the types do not match.
+Use the `equals` operator (`=`) to assign a value to a variable. Any expression
+that produces a value can be assigned to any variable as long as the
+<> are the same or the resultant
+<> can be implicitly <> to
+the variable <>. Otherwise, an error will occur.
+<> values are shallow-copied when assigned.
-*Grammar:*
+*Grammar*
[source,ANTLR4]
----
assignment: ID '=' expression
----
-
-*Examples:*
-
-Assigning a literal of the appropriate type directly to a declared variable.
-
-[source,Java]
-----
-int i; // Declare an int i
-i = 10; // Set the int i to the int literal 10
-----
-
-Immediately assigning a value when declaring a variable.
-
-[source,Java]
-----
-int i = 10; // Declare the int variable i and set it the int literal 1
-double j = 2.0; // Declare the double variable j and set it to the double
- // literal 2.0
-----
-
-Assigning a variable of one primitive type to another variable of the same
-type.
-
-[source,Java]
-----
-int i = 10; // Declare the int variable i and set it to the int literal 10
-int j = i; // Declare the int variable j and set it to the int variable i
-----
-
-Assigning a reference type to a new heap allocation with the `new` operator.
-
-[source,Java]
-----
-ArrayList l = new ArrayList(); // Declare an ArrayList variable l and set it
- // to a newly allocated ArrayList
-Map m = new HashMap(); // Declare a Map variable m and set it
- // to a newly allocated HashMap
-----
-
-Assigning a variable of one reference type to another variable of the same type.
-
-[source,Java]
-----
-List l = new ArrayList(); // Declare List variable l and set it a newly
- // allocated ArrayList
-List k = l; // Declare List variable k and set it to the
- // value of the List variable l
-List m; // Declare List variable m and set it the
- // default value null
-m = k; // Set the value of List variable m to the value
- // of List variable k
-----
+*Examples*
+
+* Variable assignment with an <>.
++
+[source,Painless]
+----
+<1> int i;
+<2> i = 10;
+----
++
+<1> declare `int i`
+<2> assign `10` to `i`
++
+* <> combined with immediate variable assignment.
++
+[source,Painless]
+----
+<1> int i = 10;
+<2> double j = 2.0;
+----
++
+<1> declare `int i`; assign `10` to `i`
+<2> declare `double j`; assign `2.0` to `j`
++
+* Assignment of one variable to another using
+<>.
++
+[source,Painless]
+----
+<1> int i = 10;
+<2> int j = i;
+----
++
+<1> declare `int i`; assign `10` to `i`
+<2> declare `int j`; assign the value of `i` to `j`
++
+* Assignment with <> using the
+<>.
++
+[source,Painless]
+----
+<1> ArrayList l = new ArrayList();
+<2> Map m = new HashMap();
+----
++
+<1> declare `ArrayList l`; assign a newly-allocated `ArrayList` to `l`
+<2> declare `Map m`; assign a newly-allocated `HashMap` to `m`
+ with an implicit cast to `Map`
++
+* Assignment of one variable to another using
+<>.
++
+[source,Painless]
+----
+<1> List l = new ArrayList();
+<2> List k = l;
+<3> List m;
+<4> m = k;
+----
++
+<1> declare `List l`; assign a newly-allocated `ArrayList` to `l`
+ with an implicit cast to `List`
+<2> declare `List k`; assign a shallow-copy of `l` to `k`
+<3> declare `List m`;
+<4> assign a shallow-copy of `k` to `m`
diff --git a/docs/plugins/analysis.asciidoc b/docs/plugins/analysis.asciidoc
index 3c3df021de5cb..c09c48640ea3d 100644
--- a/docs/plugins/analysis.asciidoc
+++ b/docs/plugins/analysis.asciidoc
@@ -53,6 +53,7 @@ A number of analysis plugins have been contributed by our community:
* https://github.com/duydo/elasticsearch-analysis-vietnamese[Vietnamese Analysis Plugin] (by Duy Do)
* https://github.com/ofir123/elasticsearch-network-analysis[Network Addresses Analysis Plugin] (by Ofir123)
* https://github.com/medcl/elasticsearch-analysis-string2int[String2Integer Analysis Plugin] (by Medcl)
+* https://github.com/ZarHenry96/elasticsearch-dandelion-plugin[Dandelion Analysis Plugin] (by ZarHenry96)
include::analysis-icu.asciidoc[]
diff --git a/docs/plugins/discovery-azure-classic.asciidoc b/docs/plugins/discovery-azure-classic.asciidoc
index f11b4018bf5d1..c56991b8f507f 100644
--- a/docs/plugins/discovery-azure-classic.asciidoc
+++ b/docs/plugins/discovery-azure-classic.asciidoc
@@ -372,6 +372,8 @@ This command should give you a JSON result:
"cluster_uuid" : "AT69_T_DTp-1qgIJlatQqA",
"version" : {
"number" : "{version}",
+ "build_flavor" : "oss",
+ "build_type" : "zip",
"build_hash" : "f27399d",
"build_date" : "2016-03-30T09:51:41.449Z",
"build_snapshot" : false,
diff --git a/docs/reference/aggregations/bucket/datehistogram-aggregation.asciidoc b/docs/reference/aggregations/bucket/datehistogram-aggregation.asciidoc
index 30ea2832a700e..c2d1614ad6e56 100644
--- a/docs/reference/aggregations/bucket/datehistogram-aggregation.asciidoc
+++ b/docs/reference/aggregations/bucket/datehistogram-aggregation.asciidoc
@@ -27,11 +27,13 @@ POST /sales/_search?size=0
// CONSOLE
// TEST[setup:sales]
-Available expressions for interval: `year`, `quarter`, `month`, `week`, `day`, `hour`, `minute`, `second`
+Available expressions for interval: `year` (`1y`), `quarter` (`1q`), `month` (`1M`), `week` (`1w`),
+`day` (`1d`), `hour` (`1h`), `minute` (`1m`), `second` (`1s`)
Time values can also be specified via abbreviations supported by <> parsing.
Note that fractional time values are not supported, but you can address this by shifting to another
-time unit (e.g., `1.5h` could instead be specified as `90m`).
+time unit (e.g., `1.5h` could instead be specified as `90m`). Also note that time intervals larger
+than days do not support arbitrary values but can only be one unit large (e.g. `1y` is valid, `2y` is not).
[source,js]
--------------------------------------------------
diff --git a/docs/reference/cat.asciidoc b/docs/reference/cat.asciidoc
index 3dff5abc52d9a..7a2262b7962bb 100644
--- a/docs/reference/cat.asciidoc
+++ b/docs/reference/cat.asciidoc
@@ -93,8 +93,8 @@ Responds with:
// TESTRESPONSE[s/9300 27 sLBaIGK/\\d+ \\d+ .+/ _cat]
You can also request multiple columns using simple wildcards like
-`/_cat/thread_pool?h=ip,bulk.*` to get all headers (or aliases) starting
-with `bulk.`.
+`/_cat/thread_pool?h=ip,queue*` to get all headers (or aliases) starting
+with `queue`.
[float]
[[numeric-formats]]
diff --git a/docs/reference/cat/thread_pool.asciidoc b/docs/reference/cat/thread_pool.asciidoc
index bfc5ca415c3ba..306650feb958b 100644
--- a/docs/reference/cat/thread_pool.asciidoc
+++ b/docs/reference/cat/thread_pool.asciidoc
@@ -14,20 +14,20 @@ Which looks like:
[source,txt]
--------------------------------------------------
-node-0 bulk 0 0 0
+node-0 analyze 0 0 0
node-0 fetch_shard_started 0 0 0
node-0 fetch_shard_store 0 0 0
node-0 flush 0 0 0
node-0 force_merge 0 0 0
node-0 generic 0 0 0
node-0 get 0 0 0
-node-0 index 0 0 0
node-0 listener 0 0 0
node-0 management 1 0 0
node-0 refresh 0 0 0
node-0 search 0 0 0
node-0 snapshot 0 0 0
node-0 warmer 0 0 0
+node-0 write 0 0 0
--------------------------------------------------
// TESTRESPONSE[s/\d+/\\d+/ _cat]
@@ -43,20 +43,20 @@ The second column is the thread pool name
[source,txt]
--------------------------------------------------
name
-bulk
+analyze
fetch_shard_started
fetch_shard_store
flush
force_merge
generic
get
-index
listener
management
refresh
search
snapshot
warmer
+write
--------------------------------------------------
diff --git a/docs/reference/cluster/nodes-info.asciidoc b/docs/reference/cluster/nodes-info.asciidoc
index 2b91310da3a8e..6522d0f5ad68a 100644
--- a/docs/reference/cluster/nodes-info.asciidoc
+++ b/docs/reference/cluster/nodes-info.asciidoc
@@ -142,6 +142,8 @@ The result will look similar to:
"host": "node-0.elastic.co",
"ip": "192.168.17",
"version": "{version}",
+ "build_flavor": "oss",
+ "build_type": "zip",
"build_hash": "587409e",
"roles": [
"master",
@@ -235,6 +237,8 @@ The result will look similar to:
"host": "node-0.elastic.co",
"ip": "192.168.17",
"version": "{version}",
+ "build_flavor": "oss",
+ "build_type": "zip",
"build_hash": "587409e",
"roles": [],
"attributes": {},
diff --git a/docs/reference/cluster/nodes-stats.asciidoc b/docs/reference/cluster/nodes-stats.asciidoc
index ec25d27d2535f..eb3abb19d1adf 100644
--- a/docs/reference/cluster/nodes-stats.asciidoc
+++ b/docs/reference/cluster/nodes-stats.asciidoc
@@ -346,7 +346,6 @@ Supported metrics are:
* `search`
* `segments`
* `store`
-* `suggest`
* `translog`
* `warmer`
diff --git a/docs/reference/cluster/remote-info.asciidoc b/docs/reference/cluster/remote-info.asciidoc
index d044f4dcad221..3dfcc201e7ac4 100644
--- a/docs/reference/cluster/remote-info.asciidoc
+++ b/docs/reference/cluster/remote-info.asciidoc
@@ -19,9 +19,6 @@ the configured remote cluster alias.
`seeds`::
The configured initial seed transport addresses of the remote cluster.
-`http_addresses`::
- The published http addresses of all connected remote nodes.
-
`connected`::
True if there is at least one connection to the remote cluster.
diff --git a/docs/reference/docs/delete-by-query.asciidoc b/docs/reference/docs/delete-by-query.asciidoc
index be015a811e9b3..f9919483e5a47 100644
--- a/docs/reference/docs/delete-by-query.asciidoc
+++ b/docs/reference/docs/delete-by-query.asciidoc
@@ -284,9 +284,12 @@ executed again in order to conform to `requests_per_second`.
`failures`::
-Array of all indexing failures. If this is non-empty then the request aborted
-because of those failures. See `conflicts` for how to prevent version conflicts
-from aborting the operation.
+Array of failures if there were any unrecoverable errors during the process. If
+this is non-empty then the request aborted because of those failures.
+Delete-by-query is implemented using batches and any failure causes the entire
+process to abort but all failures in the current batch are collected into the
+array. You can use the `conflicts` option to prevent delete-by-query from aborting on
+version conflicts.
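+
+For example, a sketch (index name and query are illustrative) of a
+delete-by-query that proceeds past version conflicts instead of aborting:
+
+[source,js]
+--------------------------------------------------
+POST /twitter/_delete_by_query?conflicts=proceed
+{
+  "query": { "match_all": {} }
+}
+--------------------------------------------------
+// NOTCONSOLE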
[float]
diff --git a/docs/reference/docs/delete.asciidoc b/docs/reference/docs/delete.asciidoc
index 782a625586b87..49f31eb2d75fb 100644
--- a/docs/reference/docs/delete.asciidoc
+++ b/docs/reference/docs/delete.asciidoc
@@ -39,11 +39,14 @@ The result of the above delete operation is:
[[delete-versioning]]
=== Versioning
-Each document indexed is versioned. When deleting a document, the
-`version` can be specified to make sure the relevant document we are
-trying to delete is actually being deleted and it has not changed in the
-meantime. Every write operation executed on a document, deletes included,
-causes its version to be incremented.
+Each document indexed is versioned. When deleting a document, the `version` can
+be specified to make sure the relevant document we are trying to delete is
+actually being deleted and it has not changed in the meantime. Every write
+operation executed on a document, deletes included, causes its version to be
+incremented. The version number of a deleted document remains available for a
+short time after deletion to allow for control of concurrent operations. The
+length of time for which a deleted document's version remains available is
+determined by the `index.gc_deletes` index setting and defaults to 60 seconds.
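+
+For example, a minimal sketch of a versioned delete (index, id, and version
+values are illustrative) that only succeeds if the document's current version
+is still `2`:
+
+[source,js]
+--------------------------------------------------
+DELETE /twitter/_doc/1?version=2
+--------------------------------------------------
+// NOTCONSOLE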
[float]
[[delete-routing]]
diff --git a/docs/reference/docs/reindex.asciidoc b/docs/reference/docs/reindex.asciidoc
index 5f34371ab8467..e8283abfc2ef0 100644
--- a/docs/reference/docs/reindex.asciidoc
+++ b/docs/reference/docs/reindex.asciidoc
@@ -161,12 +161,12 @@ POST _reindex
`index` and `type` in `source` can both be lists, allowing you to copy from
lots of sources in one request. This will copy documents from the `_doc` and
-`post` types in the `twitter` and `blog` index. The copied documents would include the
-`post` type in the `twitter` index and the `_doc` type in the `blog` index. For more
+`post` types in the `twitter` and `blog` index. The copied documents would include the
+`post` type in the `twitter` index and the `_doc` type in the `blog` index. For more
specific parameters, you can use `query`.
-The Reindex API makes no effort to handle ID collisions. For such issues, the target index
-will remain valid, but it's not easy to predict which document will survive because
+The Reindex API makes no effort to handle ID collisions. For such issues, the target index
+will remain valid, but it's not easy to predict which document will survive because
the iteration order isn't well defined.
[source,js]
@@ -666,9 +666,11 @@ executed again in order to conform to `requests_per_second`.
`failures`::
-Array of all indexing failures. If this is non-empty then the request aborted
-because of those failures. See `conflicts` for how to prevent version conflicts
-from aborting the operation.
+Array of failures if there were any unrecoverable errors during the process. If
+this is non-empty then the request aborted because of those failures. Reindex
+is implemented using batches and any failure causes the entire process to abort
+but all failures in the current batch are collected into the array. You can use
+the `conflicts` option to prevent reindex from aborting on version conflicts.
[float]
[[docs-reindex-task-api]]
@@ -1004,7 +1006,7 @@ number for most indices. If slicing manually or otherwise tuning
automatic slicing, use these guidelines.
Query performance is most efficient when the number of `slices` is equal to the
-number of shards in the index. If that number is large (e.g. 500),
+number of shards in the index. If that number is large (e.g. 500),
choose a lower number as too many `slices` will hurt performance. Setting
`slices` higher than the number of shards generally does not improve efficiency
and adds overhead.
@@ -1018,7 +1020,7 @@ documents being reindexed and cluster resources.
[float]
=== Reindex daily indices
-You can use `_reindex` in combination with <>
+You can use `_reindex` in combination with <>
to reindex daily indices to apply a new template to the existing documents.
Assuming you have indices consisting of documents as follows:
diff --git a/docs/reference/docs/update-by-query.asciidoc b/docs/reference/docs/update-by-query.asciidoc
index 482f3d62f5d5d..1d81e4a44ff24 100644
--- a/docs/reference/docs/update-by-query.asciidoc
+++ b/docs/reference/docs/update-by-query.asciidoc
@@ -338,9 +338,13 @@ executed again in order to conform to `requests_per_second`.
`failures`::
-Array of all indexing failures. If this is non-empty then the request aborted
-because of those failures. See `conflicts` for how to prevent version conflicts
-from aborting the operation.
+Array of failures if there were any unrecoverable errors during the process. If
+this is non-empty then the request aborted because of those failures.
+Update-by-query is implemented using batches and any failure causes the entire
+process to abort but all failures in the current batch are collected into the
+array. You can use the `conflicts` option to prevent update-by-query from aborting on
+version conflicts.
+
[float]
diff --git a/docs/reference/index-modules.asciidoc b/docs/reference/index-modules.asciidoc
index 0ab742108b92f..ed0077a629d7c 100644
--- a/docs/reference/index-modules.asciidoc
+++ b/docs/reference/index-modules.asciidoc
@@ -214,6 +214,27 @@ specific index module:
The maximum length of regex that can be used in Regexp Query.
Defaults to `1000`.
+ `index.routing.allocation.enable`::
+
+ Controls shard allocation for this index. It can be set to:
+ * `all` (default) - Allows shard allocation for all shards.
+ * `primaries` - Allows shard allocation only for primary shards.
+ * `new_primaries` - Allows shard allocation only for newly-created primary shards.
+ * `none` - No shard allocation is allowed.
+
+ `index.routing.rebalance.enable`::
+
+ Enables shard rebalancing for this index. It can be set to:
+ * `all` (default) - Allows shard rebalancing for all shards.
+ * `primaries` - Allows shard rebalancing only for primary shards.
+ * `replicas` - Allows shard rebalancing only for replica shards.
+ * `none` - No shard rebalancing is allowed.
+
+ `index.gc_deletes`::
+
+ The length of time that a <> remains available for <>.
+ Defaults to `60s`.
+
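+For example, a sketch (with an illustrative index name) that updates two of
+these dynamic settings on a live index:
+
+[source,js]
+--------------------------------------------------
+PUT /my_index/_settings
+{
+  "index.routing.allocation.enable": "primaries",
+  "index.gc_deletes": "120s"
+}
+--------------------------------------------------
+// NOTCONSOLE
+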
[float]
=== Settings in other index modules
diff --git a/docs/reference/index-modules/merge.asciidoc b/docs/reference/index-modules/merge.asciidoc
index 97db09ba656c7..cc0613ec2870d 100644
--- a/docs/reference/index-modules/merge.asciidoc
+++ b/docs/reference/index-modules/merge.asciidoc
@@ -23,7 +23,8 @@ The merge scheduler supports the following _dynamic_ setting:
`index.merge.scheduler.max_thread_count`::
- The maximum number of threads that may be merging at once. Defaults to
+ The maximum number of threads on a single shard that may be merging at once.
+ Defaults to
`Math.max(1, Math.min(4, Runtime.getRuntime().availableProcessors() / 2))`
which works well for a good solid-state-disk (SSD). If your index is on
spinning platter drives instead, decrease this to 1.
diff --git a/docs/reference/index.x.asciidoc b/docs/reference/index.x.asciidoc
index bbfdf515bc72d..5be21cb004331 100644
--- a/docs/reference/index.x.asciidoc
+++ b/docs/reference/index.x.asciidoc
@@ -4,7 +4,7 @@
:include-xpack: true
:es-test-dir: {docdir}/../src/test
:plugins-examples-dir: {docdir}/../../plugins/examples
-:xes-repo-dir: {docdir}/../../../elasticsearch-extra/x-pack-elasticsearch/docs/{lang}
+:xes-repo-dir: {docdir}/../../x-pack/docs/{lang}
:es-repo-dir: {docdir}
diff --git a/docs/reference/mapping/removal_of_types.asciidoc b/docs/reference/mapping/removal_of_types.asciidoc
index 070d189a0fffe..95881ba83856f 100644
--- a/docs/reference/mapping/removal_of_types.asciidoc
+++ b/docs/reference/mapping/removal_of_types.asciidoc
@@ -258,15 +258,17 @@ Elasticsearch 6.x::
Elasticsearch 7.x::
-* The `type` parameter in URLs are optional. For instance, indexing
+* The `type` parameter in URLs is deprecated. For instance, indexing
a document no longer requires a document `type`. The new index APIs
are `PUT {index}/_doc/{id}` in case of explicit ids and `POST {index}/_doc`
for auto-generated ids.
-* The `GET|PUT _mapping` APIs support a query string parameter
- (`include_type_name`) which indicates whether the body should include
- a layer for the type name. It defaults to `true`. 7.x indices which
- don't have an explicit type will use the dummy type name `_doc`.
+* The index creation, `GET|PUT _mapping` and document APIs support a query
+ string parameter (`include_type_name`) which indicates whether requests and
+ responses should include a type name. It defaults to `true`.
+ 7.x indices which don't have an explicit type will use the dummy type name
+ `_doc`. Not setting `include_type_name=false` will result in a deprecation
+ warning.
* The `_default_` mapping type is removed.
@@ -274,7 +276,8 @@ Elasticsearch 8.x::
* The `type` parameter is no longer supported in URLs.
-* The `include_type_name` parameter defaults to `false`.
+* The `include_type_name` parameter is deprecated, defaults to `false`, and fails
+  the request when set to `true`.
Elasticsearch 9.x::
@@ -421,3 +424,108 @@ POST _reindex
----
// NOTCONSOLE
+[float]
+=== Use `include_type_name=false` to prepare for upgrade to 8.0
+
+Index creation, mappings and document APIs support the `include_type_name`
+option. When set to `false`, this option enables the behavior that will become
+default in 8.0 when types are removed. See some examples of interactions with
+Elasticsearch with this option turned off:
+
+[float]
+==== Index creation
+
+[source,js]
+--------------------------------------------------
+PUT index?include_type_name=false
+{
+ "mappings": {
+ "properties": { <1>
+ "foo": {
+ "type": "keyword"
+ }
+ }
+ }
+}
+--------------------------------------------------
+// CONSOLE
+<1> Mappings are included directly under the `mappings` key, without a type name.
+
+[float]
+==== PUT and GET mappings
+
+[source,js]
+--------------------------------------------------
+PUT index
+
+PUT index/_mappings?include_type_name=false
+{
+ "properties": { <1>
+ "foo": {
+ "type": "keyword"
+ }
+ }
+}
+
+GET index/_mappings?include_type_name=false
+--------------------------------------------------
+// CONSOLE
+<1> Mappings are included directly under the `mappings` key, without a type name.
+
+
+The above call returns
+
+[source,js]
+--------------------------------------------------
+{
+ "index": {
+ "mappings": {
+ "properties": { <1>
+ "foo": {
+ "type": "keyword"
+ }
+ }
+ }
+ }
+}
+--------------------------------------------------
+// TESTRESPONSE
+<1> Mappings are included directly under the `mappings` key, without a type name.
+
+[float]
+==== Document APIs
+
+Index APIs must be called with the `{index}/_doc` path for automatic generation of
+the `_id` and with `{index}/_doc/{id}` for explicit ids.
+
+[source,js]
+--------------------------------------------------
+PUT index/_doc/1?include_type_name=false
+{
+ "foo": "bar"
+}
+--------------------------------------------------
+// CONSOLE
+
+[source,js]
+--------------------------------------------------
+{
+ "_index": "index", <1>
+ "_id": "1",
+ "_version": 1,
+ "result": "created",
+ "_shards": {
+ "total": 2,
+ "successful": 1,
+ "failed": 0
+ },
+ "_seq_no": 0,
+ "_primary_term": 1
+}
+--------------------------------------------------
+// TESTRESPONSE
+<1> The response does not include a `_type`.
+
+Likewise the <>, <>,
+<> and <> APIs do not return a `_type`
+key in the response when `include_type_name` is set to `false`.
diff --git a/docs/reference/mapping/types/geo-point.asciidoc b/docs/reference/mapping/types/geo-point.asciidoc
index 57faef2dbd7db..97f2ddb52825b 100644
--- a/docs/reference/mapping/types/geo-point.asciidoc
+++ b/docs/reference/mapping/types/geo-point.asciidoc
@@ -122,6 +122,11 @@ The following parameters are accepted by `geo_point` fields:
ignored. If `false`, geo-points containing any more than latitude and longitude
(two dimensions) values throw an exception and reject the whole document.
+<>::
+
+ Accepts a geo-point value which is substituted for any explicit `null` values.
+ Defaults to `null`, which means the field is treated as missing.
+
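+For instance, a mapping sketch (index and field names are illustrative) that
+substitutes a default location for explicit `null` values:
+
+[source,js]
+--------------------------------------------------
+PUT my_index
+{
+  "mappings": {
+    "_doc": {
+      "properties": {
+        "location": {
+          "type": "geo_point",
+          "null_value": "40.12,-71.34"
+        }
+      }
+    }
+  }
+}
+--------------------------------------------------
+// NOTCONSOLE
+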
==== Using geo-points in scripts
When accessing the value of a geo-point in a script, the value is returned as
diff --git a/docs/reference/mapping/types/token-count.asciidoc b/docs/reference/mapping/types/token-count.asciidoc
index da4220f4bb401..6f3295fab5ebb 100644
--- a/docs/reference/mapping/types/token-count.asciidoc
+++ b/docs/reference/mapping/types/token-count.asciidoc
@@ -81,7 +81,7 @@ Defaults to `true`.
<>::
- Should the field be searchable? Accepts `not_analyzed` (default) and `no`.
+ Should the field be searchable? Accepts `true` (default) and `false`.
<>::
diff --git a/docs/reference/migration/migrate_6_4.asciidoc b/docs/reference/migration/migrate_6_4.asciidoc
new file mode 100644
index 0000000000000..a761509597fd2
--- /dev/null
+++ b/docs/reference/migration/migrate_6_4.asciidoc
@@ -0,0 +1,12 @@
+[[breaking-changes-6.4]]
+== Breaking changes in 6.4
+
+[[breaking_64_api_changes]]
+=== API changes
+
+==== Field capabilities request format
+
+In the past, `fields` could be provided either as a parameter, or as part of the request
+body. Specifying `fields` in the request body is now deprecated, and instead they should
+always be supplied through a request parameter. In 7.0.0, the field capabilities API will
+not accept `fields` supplied in the request body.
diff --git a/docs/reference/migration/migrate_7_0/api.asciidoc b/docs/reference/migration/migrate_7_0/api.asciidoc
index f8b8f9670c7fa..fc037504c5128 100644
--- a/docs/reference/migration/migrate_7_0/api.asciidoc
+++ b/docs/reference/migration/migrate_7_0/api.asciidoc
@@ -52,3 +52,10 @@ and `size` will be populated for fixed thread pools.
and Update request. The Update API returns `400 - Bad request` if request contains
unknown parameters (instead of ignored in the previous version).
+[[remove-suggest-metric]]
+==== Remove support for `suggest` metric/index metric in indices stats and nodes stats APIs
+
+Previously, `suggest` stats were folded into `search` stats. Support for the
+`suggest` metric on the indices stats and nodes stats APIs remained for
+backwards compatibility. Backwards support for the `suggest` metric was
+deprecated in 6.3.0 and is now removed in 7.0.0.
diff --git a/docs/reference/migration/migrate_7_0/settings.asciidoc b/docs/reference/migration/migrate_7_0/settings.asciidoc
index 1035bc73393ac..b09cecf5a48dc 100644
--- a/docs/reference/migration/migrate_7_0/settings.asciidoc
+++ b/docs/reference/migration/migrate_7_0/settings.asciidoc
@@ -5,4 +5,23 @@
==== Percolator
* The deprecated `index.percolator.map_unmapped_fields_as_string` setting has been removed in favour of
- the `index.percolator.map_unmapped_fields_as_text` setting.
\ No newline at end of file
+ the `index.percolator.map_unmapped_fields_as_text` setting.
+
+==== Index thread pool
+
+* Internally, single-document index/delete/update requests are executed as bulk
+ requests with a single-document payload. This means that these requests are
+ executed on the bulk thread pool. As such, the indexing thread pool is no
+ longer needed and has been removed. Accordingly, the settings
+ `thread_pool.index.size` and `thread_pool.index.queue_size` have been removed.
+
+[[write-thread-pool-fallback]]
+==== Write thread pool fallback
+
+* The bulk thread pool was replaced by the write thread pool in 6.3.0. However,
+ for backwards compatibility reasons the name `bulk` was still usable as fallback
+ settings `thread_pool.bulk.size` and `thread_pool.bulk.queue_size` for
+ `thread_pool.write.size` and `thread_pool.write.queue_size`, respectively, and
+ the system property `es.thread_pool.write.use_bulk_as_display_name` was
+ available to keep the display output in APIs as `bulk` instead of `write`.
+ These fallback settings and this system property have been removed.
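+
+For example, configuration like the following sketch must now use the `write`
+thread pool names; the former `bulk` spellings are no longer accepted:
+
+[source,yaml]
+--------------------------------------------------
+thread_pool:
+  write:
+    size: 30
+    queue_size: 1000
+--------------------------------------------------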
diff --git a/docs/reference/modules/cross-cluster-search.asciidoc b/docs/reference/modules/cross-cluster-search.asciidoc
index d3c6426f271ef..21e21edc35b57 100644
--- a/docs/reference/modules/cross-cluster-search.asciidoc
+++ b/docs/reference/modules/cross-cluster-search.asciidoc
@@ -222,8 +222,7 @@ GET /cluster_one:twitter/_search
// TESTRESPONSE[s/"_score": 1/"_score": "$body.hits.hits.0._score"/]
-In contrast to the `tribe` feature cross cluster search can also search indices with the same name on different
-clusters:
+Indices can also be searched with the same name on different clusters:
[source,js]
--------------------------------------------------
diff --git a/docs/reference/modules/snapshots.asciidoc b/docs/reference/modules/snapshots.asciidoc
index ea3f99debb94e..693d537d732c1 100644
--- a/docs/reference/modules/snapshots.asciidoc
+++ b/docs/reference/modules/snapshots.asciidoc
@@ -44,12 +44,12 @@ If you register same snapshot repository with multiple clusters, only
one cluster should have write access to the repository. All other clusters
connected to that repository should set the repository to `readonly` mode.
-NOTE: The snapshot format can change across major versions, so if you have
-clusters on different major versions trying to write the same repository,
-new snapshots written by one version will not be visible to the other. While
-setting the repository to `readonly` on all but one of the clusters should work
-with multiple clusters differing by one major version, it is not a supported
-configuration.
+IMPORTANT: The snapshot format can change across major versions, so if you have
+clusters on different versions trying to write the same repository, snapshots
+written by one version may not be visible to the other and the repository could
+be corrupted. While setting the repository to `readonly` on all but one of the
+clusters should work with multiple clusters differing by one major version, it
+is not a supported configuration.
[source,js]
-----------------------------------
diff --git a/docs/reference/modules/threadpool.asciidoc b/docs/reference/modules/threadpool.asciidoc
index 984bef0a3cc3c..515959e4ea580 100644
--- a/docs/reference/modules/threadpool.asciidoc
+++ b/docs/reference/modules/threadpool.asciidoc
@@ -13,12 +13,6 @@ There are several thread pools, but the important ones include:
For generic operations (e.g., background node discovery).
Thread pool type is `scaling`.
-`index`::
- For index/delete operations. Thread pool type is `fixed`
- with a size of `# of available processors`,
- queue_size of `200`. The maximum size for this pool
- is `1 + # of available processors`.
-
`search`::
For count/search/suggest operations. Thread pool type is
`fixed_auto_queue_size` with a size of
@@ -30,11 +24,13 @@ There are several thread pools, but the important ones include:
with a size of `# of available processors`,
queue_size of `1000`.
-`bulk`::
- For bulk operations. Thread pool type is `fixed`
- with a size of `# of available processors`,
- queue_size of `200`. The maximum size for this pool
- is `1 + # of available processors`.
+`analyze`::
+ For analyze requests. Thread pool type is `fixed` with a size of 1, queue size of 16.
+
+`write`::
+ For single-document index/delete/update and bulk requests. Thread pool type
+ is `fixed` with a size of `# of available processors`, queue_size of `200`.
+ The maximum size for this pool is `1 + # of available processors`.
`snapshot`::
For snapshot/restore operations. Thread pool type is `scaling` with a
@@ -52,13 +48,13 @@ There are several thread pools, but the important ones include:
Mainly for java client executing of action when listener threaded is set to true.
Thread pool type is `scaling` with a default max of `min(10, (# of available processors)/2)`.
-Changing a specific thread pool can be done by setting its type-specific parameters; for example, changing the `index`
+Changing a specific thread pool can be done by setting its type-specific parameters; for example, changing the `write`
thread pool to have more threads:
[source,yaml]
--------------------------------------------------
thread_pool:
- index:
+  write:
size: 30
--------------------------------------------------
@@ -86,7 +82,7 @@ full, it will abort the request.
[source,yaml]
--------------------------------------------------
thread_pool:
- index:
+  write:
size: 30
queue_size: 1000
--------------------------------------------------
diff --git a/docs/reference/modules/transport.asciidoc b/docs/reference/modules/transport.asciidoc
index 50c35a4a73634..b7a65d98592cc 100644
--- a/docs/reference/modules/transport.asciidoc
+++ b/docs/reference/modules/transport.asciidoc
@@ -41,7 +41,7 @@ addressable from the outside. Defaults to the actual port assigned via
|`transport.tcp.connect_timeout` |The socket connect timeout setting (in
time setting format). Defaults to `30s`.
-|`transport.tcp.compress` |Set to `true` to enable compression (LZF)
+|`transport.tcp.compress` |Set to `true` to enable compression (`DEFLATE`)
between all nodes. Defaults to `false`.
|`transport.ping_schedule` | Schedule a regular ping message to ensure that connections are kept alive. Defaults to `5s` in the transport client and `-1` (disabled) elsewhere.
diff --git a/docs/reference/query-dsl/match-phrase-query.asciidoc b/docs/reference/query-dsl/match-phrase-query.asciidoc
index 943d0e84d36db..1f4b19eedc132 100644
--- a/docs/reference/query-dsl/match-phrase-query.asciidoc
+++ b/docs/reference/query-dsl/match-phrase-query.asciidoc
@@ -39,3 +39,5 @@ GET /_search
}
--------------------------------------------------
// CONSOLE
+
+This query also accepts `zero_terms_query`, as explained in <>.
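+
+For example, a sketch (field name and query text are illustrative) that
+matches all documents when the analyzer removes every query term:
+
+[source,js]
+--------------------------------------------------
+GET /_search
+{
+    "query": {
+        "match_phrase" : {
+            "message" : {
+                "query" : "to be or not to be",
+                "zero_terms_query" : "all"
+            }
+        }
+    }
+}
+--------------------------------------------------
+// NOTCONSOLE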
diff --git a/docs/reference/redirects.asciidoc b/docs/reference/redirects.asciidoc
index a17027fb3c335..1583726421aeb 100644
--- a/docs/reference/redirects.asciidoc
+++ b/docs/reference/redirects.asciidoc
@@ -489,7 +489,7 @@ Using `_index` in scripts has been replaced with writing `ScriptEngine` backends
=== Painless Syntax
See the
-{painless}/painless-specification.html[Painless Language Specification]
+{painless}/painless-lang-spec.html[Painless Language Specification]
in the guide to the {painless}/index.html[Painless Scripting Language].
[role="exclude",id="modules-scripting-painless-debugging"]
diff --git a/docs/reference/release-notes/7.0.0-alpha1.asciidoc b/docs/reference/release-notes/7.0.0-alpha1.asciidoc
index 128a9b7dd716b..1cc328f16598b 100644
--- a/docs/reference/release-notes/7.0.0-alpha1.asciidoc
+++ b/docs/reference/release-notes/7.0.0-alpha1.asciidoc
@@ -8,4 +8,11 @@ The changes listed below have been released for the first time in Elasticsearch
=== Breaking changes
Core::
-* Tribe node has been removed in favor of Cross-Cluster-Search
\ No newline at end of file
+* Tribe node has been removed in favor of Cross-Cluster-Search
+
+Cross-Cluster-Search::
+* `http_addresses` has been removed from the <> API
+ because it is expensive to fetch and no longer needed by Kibana.
+
+Rest API::
+* The Clear Cache API only supports `POST` as HTTP method
diff --git a/docs/reference/search/field-caps.asciidoc b/docs/reference/search/field-caps.asciidoc
index 8329d96131dff..6cb483e7a256e 100644
--- a/docs/reference/search/field-caps.asciidoc
+++ b/docs/reference/search/field-caps.asciidoc
@@ -20,7 +20,7 @@ GET twitter/_field_caps?fields=rating
// CONSOLE
// TEST[setup:twitter]
-Alternatively the `fields` option can also be defined in the request body:
+Alternatively the `fields` option can also be defined in the request body. deprecated[6.4.0, Please use a request parameter instead.]
[source,js]
--------------------------------------------------
@@ -30,6 +30,7 @@ POST _field_caps
}
--------------------------------------------------
// CONSOLE
+// TEST[warning:Specifying a request body is deprecated -- the [fields] request parameter should be used instead.]
This is equivalent to the previous request.
diff --git a/docs/reference/search/request/highlighters-internal.asciidoc b/docs/reference/search/request/highlighters-internal.asciidoc
new file mode 100644
index 0000000000000..651cdf917ced0
--- /dev/null
+++ b/docs/reference/search/request/highlighters-internal.asciidoc
@@ -0,0 +1,194 @@
+[[highlighter-internal-work]]
+==== How highlighters work internally
+
+Given a query and a text (the content of a document field), the goal of a
+highlighter is to find the best text fragments for the query, and highlight
+the query terms in the found fragments. For this, a highlighter needs to
+address several questions:
+
+- How to break a text into fragments?
+- How to find the best fragments among all fragments?
+- How to highlight the query terms in a fragment?
+
+===== How to break a text into fragments?
+Relevant settings: `fragment_size`, `fragmenter`, `type` of highlighter,
+`boundary_chars`, `boundary_max_scan`, `boundary_scanner`, `boundary_scanner_locale`.
+
+The plain highlighter begins by analyzing the text with the given analyzer
+and creating a token stream from it. The plain highlighter uses a very simple
+algorithm to break the token stream into fragments. It loops through terms in the token stream,
+and every time the current term's end_offset exceeds `fragment_size` multiplied by the number of
+created fragments, a new fragment is created. The `span` fragmenter does a little more computation
+to avoid breaking up text between highlighted terms. But overall, since the breaking is
+done only by `fragment_size`, some fragments can be quite odd, e.g. beginning
+with a punctuation mark.
+
+The unified and FVH highlighters do a better job of breaking up a text into
+fragments by utilizing Java's `BreakIterator`. This ensures that a fragment
+is a valid sentence as long as `fragment_size` allows for this.
+
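+For illustration, a request sketch (field name is illustrative) that asks the
+plain highlighter for 150-character fragments produced by the `span`
+fragmenter:
+
+[source,js]
+--------------------------------------------------
+GET /_search
+{
+    "query" : { "match" : { "content" : "fox" } },
+    "highlight" : {
+        "fields" : {
+            "content" : {
+                "type" : "plain",
+                "fragment_size" : 150,
+                "fragmenter" : "span"
+            }
+        }
+    }
+}
+--------------------------------------------------
+// NOTCONSOLE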
+
+===== How to find the best fragments?
+Relevant settings: `number_of_fragments`.
+
+To find the best, most relevant fragments, a highlighter needs to score
+each fragment with respect to the given query. The goal is to score only those
+terms that participated in generating the 'hit' on the document.
+For some complex queries, this is still work in progress.
+
+The plain highlighter creates an in-memory index from the current token stream,
+and re-runs the original query criteria through Lucene's query execution planner
+to get access to low-level match information for the current text.
+For more complex queries the original query could be converted to a span query,
+as span queries can handle phrases more accurately. This low-level match
+information is then used to score each individual fragment. The scoring method of the plain
+highlighter is quite simple. Each fragment is scored by the number of unique
+query terms found in this fragment. The score of an individual term is equal to its boost,
+which is 1 by default. Thus, by default, a fragment that contains one unique query term
+will get a score of 1, and a fragment that contains two unique query terms
+will get a score of 2, and so on. The fragments are then sorted by their scores,
+so the highest scored fragments will be output first.
+
+FVH doesn't need to analyze the text and build an in-memory index, as it uses
+pre-indexed document term vectors, and finds among them terms that correspond to the query.
+FVH scores each fragment by the number of query terms found in this fragment.
+Similarly to the plain highlighter, the score of an individual term is equal to its boost value.
+In contrast to the plain highlighter, all query terms are counted, not only unique terms.
+
+The unified highlighter can use pre-indexed term vectors or pre-indexed term offsets,
+if they are available. Otherwise, similar to the plain highlighter, it has to create
+an in-memory index from the text. The unified highlighter uses the BM25 scoring model
+to score fragments.
+
+
+===== How to highlight the query terms in a fragment?
+Relevant settings: `pre-tags`, `post-tags`.
+
+The goal is to highlight only those terms that participated in generating the 'hit' on the document.
+For some complex boolean queries, this is still a work in progress, as highlighters don't reflect
+the boolean logic of a query and only extract leaf (term, phrase, prefix, etc.) queries.
+
+The plain highlighter, given the token stream and the original text, recomposes the original text to
+highlight only those terms from the token stream that are contained in the low-level match information
+structure from the previous step.
+
+The FVH and unified highlighters use intermediate data structures to represent
+fragments in some raw form, and then populate them with actual text.
+
+A highlighter uses `pre-tags` and `post-tags` to encode highlighted terms.
+
+
+===== An example of how the unified highlighter works
+
+Let's look in more detail at how the unified highlighter works.
+
+First, we create an index with a text field `content`, which will be indexed
+using the `english` analyzer, without offsets or term vectors.
+
+[source,js]
+--------------------------------------------------
+PUT test_index
+{
+ "mappings": {
+ "_doc": {
+ "properties": {
+ "content" : {
+ "type" : "text",
+ "analyzer" : "english"
+ }
+ }
+ }
+ }
+}
+--------------------------------------------------
+// NOTCONSOLE
+
+We put the following document into the index:
+
+[source,js]
+--------------------------------------------------
+PUT test_index/_doc/doc1
+{
+ "content" : "For you I'm only a fox like a hundred thousand other foxes. But if you tame me, we'll need each other. You'll be the only boy in the world for me. I'll be the only fox in the world for you."
+}
+--------------------------------------------------
+// NOTCONSOLE
+
+
+And we run the following query with a highlight request:
+
+[source,js]
+--------------------------------------------------
+GET test_index/_search
+{
+ "query": {
+ "match_phrase" : {"content" : "only fox"}
+ },
+ "highlight": {
+ "type" : "unified",
+ "number_of_fragments" : 3,
+ "fields": {
+ "content": {}
+ }
+ }
+}
+--------------------------------------------------
+// NOTCONSOLE
+
+
+After `doc1` is found as a hit for this query, the hit will be passed to the
+unified highlighter for highlighting the field `content` of the document.
+Since the field `content` was not indexed with either offsets or term vectors,
+its raw field value will be analyzed, and an in-memory index will be built from
+the terms that match the query:
+
+ {"token":"onli","start_offset":12,"end_offset":16,"position":3},
+ {"token":"fox","start_offset":19,"end_offset":22,"position":5},
+ {"token":"fox","start_offset":53,"end_offset":58,"position":11},
+ {"token":"onli","start_offset":117,"end_offset":121,"position":24},
+ {"token":"onli","start_offset":159,"end_offset":163,"position":34},
+ {"token":"fox","start_offset":164,"end_offset":167,"position":35}
+
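+The stemmed tokens above ("onli" for "only") come from the `english` analyzer.
+The snippet below reproduces this with Lucene's `EnglishAnalyzer` directly
+(assuming Lucene's analyzers are on the classpath; the offsets in the comment
+are for this shorter sample text):
+
+[source,java]
+--------------------------------------------------
+import org.apache.lucene.analysis.Analyzer;
+import org.apache.lucene.analysis.TokenStream;
+import org.apache.lucene.analysis.en.EnglishAnalyzer;
+import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
+import org.apache.lucene.analysis.tokenattributes.OffsetAttribute;
+
+public class EnglishTokens {
+    public static void main(String[] args) throws Exception {
+        try (Analyzer analyzer = new EnglishAnalyzer();
+             TokenStream ts = analyzer.tokenStream("content", "You'll be the only fox")) {
+            CharTermAttribute term = ts.addAttribute(CharTermAttribute.class);
+            OffsetAttribute offset = ts.addAttribute(OffsetAttribute.class);
+            ts.reset();
+            while (ts.incrementToken()) {
+                // prints e.g. "onli [14,18)" and "fox [19,22)"
+                System.out.println(term + " [" + offset.startOffset() + "," + offset.endOffset() + ")");
+            }
+            ts.end();
+        }
+    }
+}
+--------------------------------------------------
+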
+Our complex phrase query will be converted to the span query
+`spanNear([text:onli, text:fox], 0, true)`, meaning that we are looking for the
+terms "onli" and "fox" within 0 distance from each other, and in the given
+order. The span query will be run against the in-memory index created before,
+to find the following match:
+
+ {"term":"onli", "start_offset":159, "end_offset":163},
+ {"term":"fox", "start_offset":164, "end_offset":167}
+
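+In Lucene terms, such a span query could be built as follows (a sketch using
+Lucene's public span query API; the field name matches our example mapping):
+
+[source,java]
+--------------------------------------------------
+import org.apache.lucene.index.Term;
+import org.apache.lucene.search.spans.SpanNearQuery;
+import org.apache.lucene.search.spans.SpanQuery;
+import org.apache.lucene.search.spans.SpanTermQuery;
+
+public class PhraseAsSpan {
+    public static SpanQuery onlyFox() {
+        return new SpanNearQuery(new SpanQuery[] {
+                new SpanTermQuery(new Term("content", "onli")),
+                new SpanTermQuery(new Term("content", "fox"))
+            }, 0, true); // slop 0, in order
+    }
+}
+--------------------------------------------------
+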
+In our example, we got a single match, but there could be several matches.
+Given the matches, the unified highlighter breaks the text of the field into
+so-called "passages". Each passage must contain at least one match.
+Using Java's `BreakIterator`, the unified highlighter ensures that each
+passage represents a full sentence as long as it doesn't exceed `fragment_size`.
+For our example, we got a single passage with the following properties
+(showing only a subset of the properties here):
+
+ Passage:
+ startOffset: 147
+ endOffset: 189
+ score: 3.7158387
+ matchStarts: [159, 164]
+ matchEnds: [163, 167]
+ numMatches: 2
+
+Notice how a passage has a score, calculated using the BM25 scoring formula
+adapted for passages. Scores allow us to choose the best-scoring
+passages when more passages are available than the `number_of_fragments`
+requested by the user. Scores also let us sort passages with
+`order: "score"` if requested by the user.
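+
+A sketch of this selection step (our own illustration): keep the
+`number_of_fragments` best-scoring passages and, unless `order: "score"` is
+requested, emit them in their order of appearance.
+
+[source,java]
+--------------------------------------------------
+import java.util.ArrayList;
+import java.util.Comparator;
+import java.util.List;
+
+public class PassageSelectionSketch {
+    static class Passage {
+        final int startOffset;
+        final float score;
+        Passage(int startOffset, float score) {
+            this.startOffset = startOffset;
+            this.score = score;
+        }
+    }
+
+    static List<Passage> select(List<Passage> passages, int numberOfFragments, boolean orderByScore) {
+        List<Passage> best = new ArrayList<>(passages);
+        best.sort(Comparator.comparingDouble((Passage p) -> p.score).reversed());
+        if (best.size() > numberOfFragments) {
+            best = best.subList(0, numberOfFragments);
+        }
+        if (orderByScore == false) { // default: order of appearance in the text
+            best.sort(Comparator.comparingInt(p -> p.startOffset));
+        }
+        return best;
+    }
+}
+--------------------------------------------------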
+
+As the final step, the unified highlighter will extract from the field's text
+a string corresponding to each passage:
+
+ "I'll be the only fox in the world for you."
+
+and will format all matches in this string with the tags,
+using the passage's `matchStarts` and `matchEnds` information:
+
+    I'll be the <em>only</em> <em>fox</em> in the world for you.
+
+Such formatted strings are the final result of the highlighter, returned
+to the user.
\ No newline at end of file
diff --git a/docs/reference/search/request/highlighting.asciidoc b/docs/reference/search/request/highlighting.asciidoc
index a6d7bcf1415d6..2da11c14b5804 100644
--- a/docs/reference/search/request/highlighting.asciidoc
+++ b/docs/reference/search/request/highlighting.asciidoc
@@ -7,6 +7,11 @@ When you request highlights, the response contains an additional `highlight`
element for each search hit that includes the highlighted fields and the
highlighted fragments.
+NOTE: Highlighters don't reflect the boolean logic of a query when extracting
+ terms to highlight. Thus, for some complex boolean queries (e.g. nested boolean
+ queries, queries using `minimum_should_match` etc.), parts of documents may be
+ highlighted that don't correspond to query matches.
+
Highlighting requires the actual content of a field. If the field is not
stored (the mapping does not set `store` to `true`), the actual `_source` is
loaded and the relevant field is extracted from `_source`.
@@ -88,7 +93,7 @@ the highlighted documents. This is important if you have large fields because
it doesn't require reanalyzing the text to be highlighted. It also requires less
disk space than using `term_vectors`.
-* Term vectors. If `term_vector` information is provided by setting
+* Term vectors. If `term_vector` information is provided by setting
`term_vector` to `with_positions_offsets` in the mapping, the `unified`
highlighter automatically uses the `term_vector` to highlight the field.
It's fast especially for large fields (> `1MB`) and for highlighting multi-term queries like
@@ -127,7 +132,7 @@ the `fvh` highlighter.
boundaries. The `boundary_max_scan` setting controls how far to scan for
boundary characters. Only valid for the `fvh` highlighter.
`sentence`::: Break highlighted fragments at the next sentence boundary, as
-determined by Java's
+determined by Java's
https://docs.oracle.com/javase/8/docs/api/java/text/BreakIterator.html[BreakIterator].
You can specify the locale to use with `boundary_scanner_locale`.
+
@@ -140,7 +145,10 @@ by Java's https://docs.oracle.com/javase/8/docs/api/java/text/BreakIterator.html
You can specify the locale to use with `boundary_scanner_locale`.
boundary_scanner_locale:: Controls which locale is used to search for sentence
-and word boundaries.
+and word boundaries. This parameter takes the form of a language tag,
+e.g. `"en-US"`, `"fr-FR"`, `"ja-JP"`. More info can be found in the
+https://docs.oracle.com/javase/8/docs/api/java/util/Locale.html#forLanguageTag-java.lang.String-[Locale Language Tag]
+documentation. The default value is https://docs.oracle.com/javase/8/docs/api/java/util/Locale.html#ROOT[Locale.ROOT].
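+
+For reference, such a tag resolves through the standard Java API, e.g.:
+
+[source,java]
+--------------------------------------------------
+import java.util.Locale;
+
+public class BoundaryLocale {
+    public static void main(String[] args) {
+        // boundary_scanner_locale values are IETF BCP 47 language tags
+        Locale locale = Locale.forLanguageTag("fr-FR");
+        System.out.println(locale.getDisplayName(Locale.ENGLISH)); // French (France)
+    }
+}
+--------------------------------------------------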
encoder:: Indicates if the snippet should be HTML encoded:
`default` (no encoding) or `html` (HTML-escape the snippet text and then
@@ -200,12 +208,16 @@ handy when you need to highlight short texts such as a title or
address, but fragmentation is not required. If `number_of_fragments`
is 0, `fragment_size` is ignored. Defaults to 5.
-order:: Sorts highlighted fragments by score when set to `score`. Only valid for
-the `unified` highlighter.
+order:: Sorts highlighted fragments by score when set to `score`. By default,
+fragments will be output in the order they appear in the field (order: `none`).
+Setting this option to `score` will output the most relevant fragments first.
+Each highlighter applies its own logic to compute relevancy scores. See
+the document <<highlighter-internal-work>>
+for more details on how the different highlighters find the best fragments.
phrase_limit:: Controls the number of matching phrases in a document that are
considered. Prevents the `fvh` highlighter from analyzing too many phrases
-and consuming too much memory. When using `matched_fields, `phrase_limit`
+and consuming too much memory. When using `matched_fields`, `phrase_limit`
phrases per matched field are considered. Raising the limit increases query
time and consumes more memory. Only supported by the `fvh` highlighter.
Defaults to 256.
@@ -929,3 +941,6 @@ Response:
If the `number_of_fragments` option is set to `0`,
`NullFragmenter` is used which does not fragment the text at all.
This is useful for highlighting the entire contents of a document or field.
+
+
+include::highlighters-internal.asciidoc[]
diff --git a/docs/reference/search/request/scroll.asciidoc b/docs/reference/search/request/scroll.asciidoc
index be725aaf362f5..0fd6979ef9568 100644
--- a/docs/reference/search/request/scroll.asciidoc
+++ b/docs/reference/search/request/scroll.asciidoc
@@ -78,9 +78,9 @@ returned with each batch of results. Each call to the `scroll` API returns the
next batch of results until there are no more results left to return, ie the
`hits` array is empty.
-IMPORTANT: The initial search request and each subsequent scroll request
-returns a new `_scroll_id` -- only the most recent `_scroll_id` should be
-used.
+IMPORTANT: The initial search request and each subsequent scroll request each
+return a `_scroll_id`, which may change with each request -- only the most
+recent `_scroll_id` should be used.
NOTE: If the request specifies aggregations, only the initial search response
will contain the aggregations results.
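+
+A minimal sketch of a scroll loop that honors this rule, assuming a
+`RestHighLevelClient` and an initial search response obtained with a scroll
+timeout (the names here are illustrative):
+
+[source,java]
+--------------------------------------------------
+import org.elasticsearch.action.search.SearchResponse;
+import org.elasticsearch.action.search.SearchScrollRequest;
+import org.elasticsearch.client.RestHighLevelClient;
+import org.elasticsearch.common.unit.TimeValue;
+import org.elasticsearch.search.SearchHit;
+
+public class ScrollLoop {
+    static void consumeAll(RestHighLevelClient client, SearchResponse initial) throws Exception {
+        String scrollId = initial.getScrollId();
+        SearchHit[] hits = initial.getHits().getHits();
+        while (hits.length > 0) {
+            SearchScrollRequest scrollRequest = new SearchScrollRequest(scrollId);
+            scrollRequest.scroll(TimeValue.timeValueMinutes(1));
+            SearchResponse response = client.searchScroll(scrollRequest);
+            // always keep the most recently returned _scroll_id
+            scrollId = response.getScrollId();
+            hits = response.getHits().getHits();
+        }
+    }
+}
+--------------------------------------------------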
diff --git a/docs/reference/setup/install/check-running.asciidoc b/docs/reference/setup/install/check-running.asciidoc
index 3ec10c26346bc..0cfc4b329ecfa 100644
--- a/docs/reference/setup/install/check-running.asciidoc
+++ b/docs/reference/setup/install/check-running.asciidoc
@@ -19,6 +19,8 @@ which should give you a response something like this:
"cluster_uuid" : "AT69_T_DTp-1qgIJlatQqA",
"version" : {
"number" : "{version}",
+ "build_flavor" : "oss",
+ "build_type" : "zip",
"build_hash" : "f27399d",
"build_date" : "2016-03-30T09:51:41.449Z",
"build_snapshot" : false,
diff --git a/gradle/wrapper/gradle-wrapper.jar b/gradle/wrapper/gradle-wrapper.jar
index 27768f1bbac3c..a5fe1cb94b9ee 100644
Binary files a/gradle/wrapper/gradle-wrapper.jar and b/gradle/wrapper/gradle-wrapper.jar differ
diff --git a/gradle/wrapper/gradle-wrapper.properties b/gradle/wrapper/gradle-wrapper.properties
index 8d6b7c2cbd9ba..7962563f742fe 100644
--- a/gradle/wrapper/gradle-wrapper.properties
+++ b/gradle/wrapper/gradle-wrapper.properties
@@ -1,6 +1,6 @@
-distributionUrl=https\://services.gradle.org/distributions/gradle-4.5-all.zip
+distributionUrl=https\://services.gradle.org/distributions/gradle-4.7-all.zip
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
zipStorePath=wrapper/dists
zipStoreBase=GRADLE_USER_HOME
-distributionSha256Sum=6ac2f8f9302f50241bf14cc5f4a3d88504ad20e61bb98c5fd048f7723b61397e
+distributionSha256Sum=203f4537da8b8075e38c036a6d14cb71b1149de5bf0a8f6db32ac2833a1d1294
diff --git a/libs/build.gradle b/libs/build.gradle
index 78eb93886243d..7f24f69eedc2e 100644
--- a/libs/build.gradle
+++ b/libs/build.gradle
@@ -34,6 +34,7 @@ subprojects {
Project depProject = dependencyToProject(dep)
if (depProject != null
&& false == depProject.path.equals(':libs:elasticsearch-core')
+ && false == isEclipse
&& depProject.path.startsWith(':libs')) {
throw new InvalidUserDataException("projects in :libs "
+ "may not depend on other projects libs except "
diff --git a/licenses/APACHE-LICENSE-2.0.txt b/licenses/APACHE-LICENSE-2.0.txt
new file mode 100644
index 0000000000000..d645695673349
--- /dev/null
+++ b/licenses/APACHE-LICENSE-2.0.txt
@@ -0,0 +1,202 @@
+
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright [yyyy] [name of copyright owner]
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
diff --git a/licenses/ELASTIC-LICENSE.txt b/licenses/ELASTIC-LICENSE.txt
new file mode 100644
index 0000000000000..7376ffc3ff107
--- /dev/null
+++ b/licenses/ELASTIC-LICENSE.txt
@@ -0,0 +1,223 @@
+ELASTIC LICENSE AGREEMENT
+
+PLEASE READ CAREFULLY THIS ELASTIC LICENSE AGREEMENT (THIS "AGREEMENT"), WHICH
+CONSTITUTES A LEGALLY BINDING AGREEMENT AND GOVERNS ALL OF YOUR USE OF ALL OF
+THE ELASTIC SOFTWARE WITH WHICH THIS AGREEMENT IS INCLUDED ("ELASTIC SOFTWARE")
+THAT IS PROVIDED IN OBJECT CODE FORMAT, AND, IN ACCORDANCE WITH SECTION 2 BELOW,
+CERTAIN OF THE ELASTIC SOFTWARE THAT IS PROVIDED IN SOURCE CODE FORMAT. BY
+INSTALLING OR USING ANY OF THE ELASTIC SOFTWARE GOVERNED BY THIS AGREEMENT, YOU
+ARE ASSENTING TO THE TERMS AND CONDITIONS OF THIS AGREEMENT. IF YOU DO NOT AGREE
+WITH SUCH TERMS AND CONDITIONS, YOU MAY NOT INSTALL OR USE THE ELASTIC SOFTWARE
+GOVERNED BY THIS AGREEMENT. IF YOU ARE INSTALLING OR USING THE SOFTWARE ON
+BEHALF OF A LEGAL ENTITY, YOU REPRESENT AND WARRANT THAT YOU HAVE THE ACTUAL
+AUTHORITY TO AGREE TO THE TERMS AND CONDITIONS OF THIS AGREEMENT ON BEHALF OF
+SUCH ENTITY.
+
+Posted Date: April 20, 2018
+
+This Agreement is entered into by and between Elasticsearch BV ("Elastic") and
+You, or the legal entity on behalf of whom You are acting (as applicable,
+"You").
+
+1. OBJECT CODE END USER LICENSES, RESTRICTIONS AND THIRD PARTY OPEN SOURCE
+SOFTWARE
+
+ 1.1 Object Code End User License. Subject to the terms and conditions of
+ Section 1.2 of this Agreement, Elastic hereby grants to You, AT NO CHARGE and
+ for so long as you are not in breach of any provision of this Agreement, a
+ License to the Basic Features and Functions of the Elastic Software.
+
+ 1.2 Reservation of Rights; Restrictions. As between Elastic and You, Elastic
+ and its licensors own all right, title and interest in and to the Elastic
+ Software, and except as expressly set forth in Sections 1.1, and 2.1 of this
+ Agreement, no other license to the Elastic Software is granted to You under
+ this Agreement, by implication, estoppel or otherwise. You agree not to: (i)
+ reverse engineer or decompile, decrypt, disassemble or otherwise reduce any
+ Elastic Software provided to You in Object Code, or any portion thereof, to
+ Source Code, except and only to the extent any such restriction is prohibited
+ by applicable law, (ii) except as expressly permitted in this Agreement,
+ prepare derivative works from, modify, copy or use the Elastic Software Object
+ Code or the Commercial Software Source Code in any manner; (iii) except as
+ expressly permitted in Section 1.1 above, transfer, sell, rent, lease,
+ distribute, sublicense, loan or otherwise transfer, Elastic Software Object
+ Code, in whole or in part, to any third party; (iv) use Elastic Software
+ Object Code for providing time-sharing services, any software-as-a-service,
+ service bureau services or as part of an application services provider or
+ other service offering (collectively, "SaaS Offering") where obtaining access
+ to the Elastic Software or the features and functions of the Elastic Software
+ is a primary reason or substantial motivation for users of the SaaS Offering
+ to access and/or use the SaaS Offering ("Prohibited SaaS Offering"); (v)
+ circumvent the limitations on use of Elastic Software provided to You in
+ Object Code format that are imposed or preserved by any License Key, or (vi)
+ alter or remove any Marks and Notices in the Elastic Software. If You have any
+ question as to whether a specific SaaS Offering constitutes a Prohibited SaaS
+ Offering, or are interested in obtaining Elastic's permission to engage in
+ commercial or non-commercial distribution of the Elastic Software, please
+ contact elastic_license@elastic.co.
+
+ 1.3 Third Party Open Source Software. The Commercial Software may contain or
+ be provided with third party open source libraries, components, utilities and
+ other open source software (collectively, "Open Source Software"), which Open
+ Source Software may have applicable license terms as identified on a website
+ designated by Elastic. Notwithstanding anything to the contrary herein, use of
+ the Open Source Software shall be subject to the license terms and conditions
+ applicable to such Open Source Software, to the extent required by the
+ applicable licensor (which terms shall not restrict the license rights granted
+ to You hereunder, but may contain additional rights). To the extent any
+ condition of this Agreement conflicts with any license to the Open Source
+ Software, the Open Source Software license will govern with respect to such
+ Open Source Software only. Elastic may also separately provide you with
+ certain open source software that is licensed by Elastic. Your use of such
+ Elastic open source software will not be governed by this Agreement, but by
+ the applicable open source license terms.
+
+2. COMMERCIAL SOFTWARE SOURCE CODE
+
+ 2.1 Limited License. Subject to the terms and conditions of Section 2.2 of
+ this Agreement, Elastic hereby grants to You, AT NO CHARGE and for so long as
+ you are not in breach of any provision of this Agreement, a limited,
+ non-exclusive, non-transferable, fully paid up royalty free right and license
+ to the Commercial Software in Source Code format, without the right to grant
+ or authorize sublicenses, to prepare Derivative Works of the Commercial
+ Software, provided You (i) do not hack the licensing mechanism, or otherwise
+ circumvent the intended limitations on the use of Elastic Software to enable
+ features other than Basic Features and Functions or those features You are
+ entitled to as part of a Subscription, and (ii) use the resulting object code
+ only for reasonable testing purposes.
+
+ 2.2 Restrictions. Nothing in Section 2.1 grants You the right to (i) use the
+ Commercial Software Source Code other than in accordance with Section 2.1
+ above, (ii) use a Derivative Work of the Commercial Software outside of a
+ Non-production Environment, in any production capacity, on a temporary or
+ permanent basis, or (iii) transfer, sell, rent, lease, distribute, sublicense,
+ loan or otherwise make available the Commercial Software Source Code, in whole
+ or in part, to any third party. Notwithstanding the foregoing, You may
+ maintain a copy of the repository in which the Source Code of the Commercial
+ Software resides and that copy may be publicly accessible, provided that you
+ include this Agreement with Your copy of the repository.
+
+3. TERMINATION
+
+ 3.1 Termination. This Agreement will automatically terminate, whether or not
+ You receive notice of such Termination from Elastic, if You breach any of its
+ provisions.
+
+ 3.2 Post Termination. Upon any termination of this Agreement, for any reason,
+ You shall promptly cease the use of the Elastic Software in Object Code format
+ and cease use of the Commercial Software in Source Code format. For the
+ avoidance of doubt, termination of this Agreement will not affect Your right
+ to use Elastic Software, in either Object Code or Source Code formats, made
+ available under the Apache License Version 2.0.
+
+ 3.3 Survival. Sections 1.2, 2.2. 3.3, 4 and 5 shall survive any termination or
+ expiration of this Agreement.
+
+4. DISCLAIMER OF WARRANTIES AND LIMITATION OF LIABILITY
+
+ 4.1 Disclaimer of Warranties. TO THE MAXIMUM EXTENT PERMITTED UNDER APPLICABLE
+ LAW, THE ELASTIC SOFTWARE IS PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND,
+ AND ELASTIC AND ITS LICENSORS MAKE NO WARRANTIES WHETHER EXPRESSED, IMPLIED OR
+ STATUTORY REGARDING OR RELATING TO THE ELASTIC SOFTWARE. TO THE MAXIMUM EXTENT
+ PERMITTED UNDER APPLICABLE LAW, ELASTIC AND ITS LICENSORS SPECIFICALLY
+ DISCLAIM ALL IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR
+ PURPOSE AND NON-INFRINGEMENT WITH RESPECT TO THE ELASTIC SOFTWARE, AND WITH
+ RESPECT TO THE USE OF THE FOREGOING. FURTHER, ELASTIC DOES NOT WARRANT RESULTS
+ OF USE OR THAT THE ELASTIC SOFTWARE WILL BE ERROR FREE OR THAT THE USE OF THE
+ ELASTIC SOFTWARE WILL BE UNINTERRUPTED.
+
+ 4.2 Limitation of Liability. IN NO EVENT SHALL ELASTIC OR ITS LICENSORS BE
+ LIABLE TO YOU OR ANY THIRD PARTY FOR ANY DIRECT OR INDIRECT DAMAGES,
+ INCLUDING, WITHOUT LIMITATION, FOR ANY LOSS OF PROFITS, LOSS OF USE, BUSINESS
+ INTERRUPTION, LOSS OF DATA, COST OF SUBSTITUTE GOODS OR SERVICES, OR FOR ANY
+ SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES OF ANY KIND, IN CONNECTION WITH
+ OR ARISING OUT OF THE USE OR INABILITY TO USE THE ELASTIC SOFTWARE, OR THE
+ PERFORMANCE OF OR FAILURE TO PERFORM THIS AGREEMENT, WHETHER ALLEGED AS A
+ BREACH OF CONTRACT OR TORTIOUS CONDUCT, INCLUDING NEGLIGENCE, EVEN IF ELASTIC
+ HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
+
+5. MISCELLANEOUS
+
+ This Agreement completely and exclusively states the entire agreement of the
+ parties regarding the subject matter herein, and it supersedes, and its terms
+ govern, all prior proposals, agreements, or other communications between the
+ parties, oral or written, regarding such subject matter. This Agreement may be
+ modified by Elastic from time to time, and any such modifications will be
+ effective upon the "Posted Date" set forth at the top of the modified
+ Agreement. If any provision hereof is held unenforceable, this Agreement will
+ continue without said provision and be interpreted to reflect the original
+ intent of the parties. This Agreement and any non-contractual obligation
+ arising out of or in connection with it, is governed exclusively by Dutch law.
+ This Agreement shall not be governed by the 1980 UN Convention on Contracts
+ for the International Sale of Goods. All disputes arising out of or in
+ connection with this Agreement, including its existence and validity, shall be
+ resolved by the courts with jurisdiction in Amsterdam, The Netherlands, except
+ where mandatory law provides for the courts at another location in The
+ Netherlands to have jurisdiction. The parties hereby irrevocably waive any and
+ all claims and defenses either might otherwise have in any such action or
+ proceeding in any of such courts based upon any alleged lack of personal
+ jurisdiction, improper venue, forum non conveniens or any similar claim or
+ defense. A breach or threatened breach, by You of Section 2 may cause
+ irreparable harm for which damages at law may not provide adequate relief, and
+ therefore Elastic shall be entitled to seek injunctive relief without being
+ required to post a bond. You may not assign this Agreement (including by
+ operation of law in connection with a merger or acquisition), in whole or in
+ part to any third party without the prior written consent of Elastic, which
+ may be withheld or granted by Elastic in its sole and absolute discretion.
+ Any assignment in violation of the preceding sentence is void. Notices to
+ Elastic may also be sent to legal@elastic.co.
+
+6. DEFINITIONS
+
+ The following terms have the meanings ascribed:
+
+ 6.1 "Affiliate" means, with respect to a party, any entity that controls, is
+ controlled by, or which is under common control with, such party, where
+ "control" means ownership of at least fifty percent (50%) of the outstanding
+ voting shares of the entity, or the contractual right to establish policy for,
+ and manage the operations of, the entity.
+
+ 6.2 "Basic Features and Functions" means those features and functions of the
+ Elastic Software that are eligible for use under a Basic license, as set forth
+ at https://www.elastic.co/subscriptions, as may be modified by Elastic from
+ time to time.
+
+ 6.3 "Commercial Software" means the Elastic Software Source Code in any file
+ containing a header stating the contents are subject to the Elastic License or
+ which is contained in the repository folder labeled "x-pack", unless a LICENSE
+ file present in the directory subtree declares a different license.
+
+ 6.4 "Derivative Work of the Commercial Software" means, for purposes of this
+ Agreement, any modification(s) or enhancement(s) to the Commercial Software,
+ which represent, as a whole, an original work of authorship.
+
+ 6.5 "License" means a limited, non-exclusive, non-transferable, fully paid up,
+ royalty free, right and license, without the right to grant or authorize
+ sublicenses, solely for Your internal business operations to (i) install and
+ use the applicable Features and Functions of the Elastic Software in Object
+ Code, and (ii) permit Contractors and Your Affiliates to use the Elastic
+ software as set forth in (i) above, provided that such use by Contractors must
+ be solely for Your benefit and/or the benefit of Your Affiliates, and You
+ shall be responsible for all acts and omissions of such Contractors and
+ Affiliates in connection with their use of the Elastic software that are
+ contrary to the terms and conditions of this Agreement.
+
+ 6.6 "License Key" means a sequence of bytes, including but not limited to a
+ JSON blob, that is used to enable certain features and functions of the
+ Elastic Software.
+
+ 6.7 "Marks and Notices" means all Elastic trademarks, trade names, logos and
+ notices present on the Documentation as originally provided by Elastic.
+
+ 6.8 "Non-production Environment" means an environment for development, testing
+ or quality assurance, where software is not used for production purposes.
+
+ 6.9 "Object Code" means any form resulting from mechanical transformation or
+ translation of Source Code form, including but not limited to compiled object
+ code, generated documentation, and conversions to other media types.
+
+ 6.10 "Source Code" means the preferred form of computer software for making
+ modifications, including but not limited to software source code,
+ documentation source, and configuration files.
+
+ 6.11 "Subscription" means the right to receive Support Services and a License
+ to the Commercial Software.
diff --git a/modules/analysis-common/src/main/java/org/elasticsearch/analysis/common/CommonAnalysisPlugin.java b/modules/analysis-common/src/main/java/org/elasticsearch/analysis/common/CommonAnalysisPlugin.java
index e0193e50313f3..a01eb52fdd498 100644
--- a/modules/analysis-common/src/main/java/org/elasticsearch/analysis/common/CommonAnalysisPlugin.java
+++ b/modules/analysis-common/src/main/java/org/elasticsearch/analysis/common/CommonAnalysisPlugin.java
@@ -67,6 +67,8 @@
import org.apache.lucene.analysis.standard.ClassicFilter;
import org.apache.lucene.analysis.tr.ApostropheFilter;
import org.apache.lucene.analysis.util.ElisionFilter;
+import org.elasticsearch.common.logging.DeprecationLogger;
+import org.elasticsearch.common.logging.Loggers;
import org.elasticsearch.index.analysis.CharFilterFactory;
import org.elasticsearch.index.analysis.PreConfiguredCharFilter;
import org.elasticsearch.index.analysis.PreConfiguredTokenFilter;
@@ -88,6 +90,9 @@
import static org.elasticsearch.plugins.AnalysisPlugin.requriesAnalysisSettings;
public class CommonAnalysisPlugin extends Plugin implements AnalysisPlugin {
+
+ private static final DeprecationLogger DEPRECATION_LOGGER = new DeprecationLogger(Loggers.getLogger(CommonAnalysisPlugin.class));
+
@Override
public Map<String, AnalysisProvider<TokenFilterFactory>> getTokenFilters() {
Map<String, AnalysisProvider<TokenFilterFactory>> filters = new TreeMap<>();
@@ -171,8 +176,14 @@ public Map<String, AnalysisProvider<TokenizerFactory>> getTokenizers() {
public List<PreConfiguredCharFilter> getPreConfiguredCharFilters() {
List<PreConfiguredCharFilter> filters = new ArrayList<>();
filters.add(PreConfiguredCharFilter.singleton("html_strip", false, HTMLStripCharFilter::new));
- // TODO deprecate htmlStrip
- filters.add(PreConfiguredCharFilter.singleton("htmlStrip", false, HTMLStripCharFilter::new));
+ filters.add(PreConfiguredCharFilter.singletonWithVersion("htmlStrip", false, (reader, version) -> {
+ if (version.onOrAfter(org.elasticsearch.Version.V_6_3_0)) {
+ DEPRECATION_LOGGER.deprecatedAndMaybeLog("htmlStrip_deprecation",
+ "The [htmpStrip] char filter name is deprecated and will be removed in a future version. "
+ + "Please change the filter name to [html_strip] instead.");
+ }
+ return new HTMLStripCharFilter(reader);
+ }));
return filters;
}
diff --git a/modules/analysis-common/src/test/java/org/elasticsearch/analysis/common/HtmlStripCharFilterFactoryTests.java b/modules/analysis-common/src/test/java/org/elasticsearch/analysis/common/HtmlStripCharFilterFactoryTests.java
new file mode 100644
index 0000000000000..0d5389a6d6594
--- /dev/null
+++ b/modules/analysis-common/src/test/java/org/elasticsearch/analysis/common/HtmlStripCharFilterFactoryTests.java
@@ -0,0 +1,73 @@
+/*
+ * Licensed to Elasticsearch under one or more contributor
+ * license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright
+ * ownership. Elasticsearch licenses this file to you under
+ * the Apache License, Version 2.0 (the "License"); you may
+ * not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.elasticsearch.analysis.common;
+
+import org.elasticsearch.Version;
+import org.elasticsearch.cluster.metadata.IndexMetaData;
+import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.env.Environment;
+import org.elasticsearch.index.IndexSettings;
+import org.elasticsearch.index.analysis.CharFilterFactory;
+import org.elasticsearch.test.ESTestCase;
+import org.elasticsearch.test.IndexSettingsModule;
+import org.elasticsearch.test.VersionUtils;
+
+import java.io.IOException;
+import java.io.StringReader;
+import java.util.Map;
+
+
+public class HtmlStripCharFilterFactoryTests extends ESTestCase {
+
+ /**
+ * Check that the deprecated name "htmlStrip" issues a deprecation warning for indices created since 6.3.0
+ */
+ public void testDeprecationWarning() throws IOException {
+ Settings settings = Settings.builder().put(Environment.PATH_HOME_SETTING.getKey(), createTempDir())
+ .put(IndexMetaData.SETTING_VERSION_CREATED, VersionUtils.randomVersionBetween(random(), Version.V_6_3_0, Version.CURRENT))
+ .build();
+
+ IndexSettings idxSettings = IndexSettingsModule.newIndexSettings("index", settings);
+ try (CommonAnalysisPlugin commonAnalysisPlugin = new CommonAnalysisPlugin()) {
+ Map<String, CharFilterFactory> charFilters = createTestAnalysis(idxSettings, settings, commonAnalysisPlugin).charFilter;
+ CharFilterFactory charFilterFactory = charFilters.get("htmlStrip");
+ assertNotNull(charFilterFactory.create(new StringReader("input")));
+ assertWarnings("The [htmpStrip] char filter name is deprecated and will be removed in a future version. "
+ + "Please change the filter name to [html_strip] instead.");
+ }
+ }
+
+ /**
+ * Check that the deprecated name "htmlStrip" does NOT issue a deprecation warning for indices created before 6.3.0
+ */
+ public void testNoDeprecationWarningPre6_3() throws IOException {
+ Settings settings = Settings.builder().put(Environment.PATH_HOME_SETTING.getKey(), createTempDir())
+ .put(IndexMetaData.SETTING_VERSION_CREATED,
+ VersionUtils.randomVersionBetween(random(), Version.V_5_0_0, Version.V_6_2_4))
+ .build();
+
+ IndexSettings idxSettings = IndexSettingsModule.newIndexSettings("index", settings);
+ try (CommonAnalysisPlugin commonAnalysisPlugin = new CommonAnalysisPlugin()) {
+ Map<String, CharFilterFactory> charFilters = createTestAnalysis(idxSettings, settings, commonAnalysisPlugin).charFilter;
+ CharFilterFactory charFilterFactory = charFilters.get("htmlStrip");
+ assertNotNull(charFilterFactory.create(new StringReader("")));
+ }
+ }
+}
diff --git a/modules/analysis-common/src/test/resources/rest-api-spec/test/indices.analyze/10_analyze.yml b/modules/analysis-common/src/test/resources/rest-api-spec/test/indices.analyze/10_analyze.yml
index cbb8f053cfbba..f8fc3acc02c4c 100644
--- a/modules/analysis-common/src/test/resources/rest-api-spec/test/indices.analyze/10_analyze.yml
+++ b/modules/analysis-common/src/test/resources/rest-api-spec/test/indices.analyze/10_analyze.yml
@@ -17,3 +17,56 @@
- match: { error.type: illegal_argument_exception }
- match: { error.reason: "Custom normalizer may not use filter [word_delimiter]" }
+---
+"htmlStrip_deprecated":
+ - skip:
+ version: " - 6.2.99"
+ reason: deprecated in 6.3
+ features: "warnings"
+
+ - do:
+ indices.create:
+ index: test_deprecated_htmlstrip
+ body:
+ settings:
+ index:
+ analysis:
+ analyzer:
+ my_htmlStripWithCharfilter:
+ tokenizer: keyword
+ char_filter: ["htmlStrip"]
+ mappings:
+ type:
+ properties:
+ name:
+ type: text
+ analyzer: my_htmlStripWithCharfilter
+
+ - do:
+ warnings:
+ - 'The [htmlStrip] char filter name is deprecated and will be removed in a future version. Please change the filter name to [html_strip] instead.'
+ index:
+ index: test_deprecated_htmlstrip
+ type: type
+ id: 1
+ body: { "name": "foo bar" }
+
+ - do:
+ warnings:
+ - 'The [htmlStrip] char filter name is deprecated and will be removed in a future version. Please change the filter name to [html_strip] instead.'
+ index:
+ index: test_deprecated_htmlstrip
+ type: type
+ id: 2
+ body: { "name": "foo baz" }
+
+ - do:
+ warnings:
+ - 'The [htmlStrip] char filter name is deprecated and will be removed in a future version. Please change the filter name to [html_strip] instead.'
+ indices.analyze:
+ index: test_deprecated_htmlstrip
+ body:
+ analyzer: "my_htmlStripWithCharfilter"
+ text: "foo"
+ - length: { tokens: 1 }
+ - match: { tokens.0.token: "\nfoo\n" }
diff --git a/modules/ingest-common/src/test/java/org/elasticsearch/ingest/common/RenameProcessorTests.java b/modules/ingest-common/src/test/java/org/elasticsearch/ingest/common/RenameProcessorTests.java
index 758e5eb997297..bf35918ad6e24 100644
--- a/modules/ingest-common/src/test/java/org/elasticsearch/ingest/common/RenameProcessorTests.java
+++ b/modules/ingest-common/src/test/java/org/elasticsearch/ingest/common/RenameProcessorTests.java
@@ -128,7 +128,7 @@ public void testRenameExistingFieldNullValue() throws Exception {
IngestDocument ingestDocument = RandomDocumentPicks.randomIngestDocument(random(), new HashMap<>());
String fieldName = RandomDocumentPicks.randomFieldName(random());
ingestDocument.setFieldValue(fieldName, null);
- String newFieldName = RandomDocumentPicks.randomFieldName(random());
+ String newFieldName = randomValueOtherThanMany(ingestDocument::hasField, () -> RandomDocumentPicks.randomFieldName(random()));
Processor processor = new RenameProcessor(randomAlphaOfLength(10), fieldName, newFieldName, false);
processor.execute(ingestDocument);
assertThat(ingestDocument.hasField(fieldName), equalTo(false));
diff --git a/modules/lang-painless/src/main/antlr/PainlessParser.g4 b/modules/lang-painless/src/main/antlr/PainlessParser.g4
index bfa4ee28dcc88..5292b4d195056 100644
--- a/modules/lang-painless/src/main/antlr/PainlessParser.g4
+++ b/modules/lang-painless/src/main/antlr/PainlessParser.g4
@@ -22,7 +22,7 @@ parser grammar PainlessParser;
options { tokenVocab=PainlessLexer; }
source
- : function* statement* EOF
+ : function* statement* dstatement? EOF
;
function
@@ -33,23 +33,31 @@ parameters
: LP ( decltype ID ( COMMA decltype ID )* )? RP
;
+statement
+ : rstatement
+ | dstatement SEMICOLON
+ ;
+
// Note we use a predicate on the if/else case here to prevent the
// "dangling-else" ambiguity by forcing the 'else' token to be consumed
// as soon as one is found. See (https://en.wikipedia.org/wiki/Dangling_else).
-statement
+rstatement
: IF LP expression RP trailer ( ELSE trailer | { _input.LA(1) != ELSE }? ) # if
| WHILE LP expression RP ( trailer | empty ) # while
- | DO block WHILE LP expression RP delimiter # do
| FOR LP initializer? SEMICOLON expression? SEMICOLON afterthought? RP ( trailer | empty ) # for
| FOR LP decltype ID COLON expression RP trailer # each
| FOR LP ID IN expression RP trailer # ineach
- | declaration delimiter # decl
- | CONTINUE delimiter # continue
- | BREAK delimiter # break
- | RETURN expression delimiter # return
| TRY block trap+ # try
- | THROW expression delimiter # throw
- | expression delimiter # expr
+ ;
+
+dstatement
+ : DO block WHILE LP expression RP # do
+ | declaration # decl
+ | CONTINUE # continue
+ | BREAK # break
+ | RETURN expression # return
+ | THROW expression # throw
+ | expression # expr
;
trailer
@@ -58,7 +66,7 @@ trailer
;
block
- : LBRACK statement* RBRACK
+ : LBRACK statement* dstatement? RBRACK
;
empty
@@ -90,11 +98,6 @@ trap
: CATCH LP TYPE ID RP block
;
-delimiter
- : SEMICOLON
- | EOF
- ;
-
expression
: unary # single
| expression ( MUL | DIV | REM ) expression # binary
@@ -169,8 +172,8 @@ braceaccess
;
arrayinitializer
- : NEW TYPE ( LBRACE expression RBRACE )+ ( postdot postfix* )? # newstandardarray
- | NEW TYPE LBRACE RBRACE LBRACK ( expression ( COMMA expression )* )? SEMICOLON? RBRACK postfix* # newinitializedarray
+ : NEW TYPE ( LBRACE expression RBRACE )+ ( postdot postfix* )? # newstandardarray
+ | NEW TYPE LBRACE RBRACE LBRACK ( expression ( COMMA expression )* )? RBRACK postfix* # newinitializedarray
;
listinitializer
@@ -206,10 +209,8 @@ lamtype
;
funcref
- : TYPE REF ID # classfuncref // reference to a static or instance method,
- // e.g. ArrayList::size or Integer::compare
- | decltype REF NEW # constructorfuncref // reference to a constructor, e.g. ArrayList::new
- | ID REF ID # capturingfuncref // reference to an instance method, e.g. object::toString
- // currently limited to capture of a simple variable (id).
- | THIS REF ID # localfuncref // reference to a local function, e.g. this::myfunc
+ : TYPE REF ID # classfuncref
+ | decltype REF NEW # constructorfuncref
+ | ID REF ID # capturingfuncref
+ | THIS REF ID # localfuncref
;
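+
+// Note: with the statement/rstatement/dstatement split above, the last
+// statement of a block (and of the whole script) no longer needs a trailing
+// semicolon, since the old `delimiter` rule (SEMICOLON | EOF) is gone.
+// For example, this (illustrative) Painless source is now valid:
+//
+//   int total = 0; for (int i = 0; i < 10; ++i) { total += i } return total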
diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/PainlessExecuteAction.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/PainlessExecuteAction.java
new file mode 100644
index 0000000000000..aa650a37c4fa2
--- /dev/null
+++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/PainlessExecuteAction.java
@@ -0,0 +1,338 @@
+/*
+ * Licensed to Elasticsearch under one or more contributor
+ * license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright
+ * ownership. Elasticsearch licenses this file to you under
+ * the Apache License, Version 2.0 (the "License"); you may
+ * not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.elasticsearch.painless;
+
+import org.elasticsearch.action.Action;
+import org.elasticsearch.action.ActionListener;
+import org.elasticsearch.action.ActionRequest;
+import org.elasticsearch.action.ActionRequestBuilder;
+import org.elasticsearch.action.ActionRequestValidationException;
+import org.elasticsearch.action.ActionResponse;
+import org.elasticsearch.action.support.ActionFilters;
+import org.elasticsearch.action.support.HandledTransportAction;
+import org.elasticsearch.client.ElasticsearchClient;
+import org.elasticsearch.client.node.NodeClient;
+import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver;
+import org.elasticsearch.common.ParseField;
+import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.io.stream.StreamInput;
+import org.elasticsearch.common.io.stream.StreamOutput;
+import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.common.xcontent.ConstructingObjectParser;
+import org.elasticsearch.common.xcontent.ToXContent;
+import org.elasticsearch.common.xcontent.ToXContentObject;
+import org.elasticsearch.common.xcontent.XContentBuilder;
+import org.elasticsearch.common.xcontent.XContentParser;
+import org.elasticsearch.rest.BaseRestHandler;
+import org.elasticsearch.rest.BytesRestResponse;
+import org.elasticsearch.rest.RestController;
+import org.elasticsearch.rest.RestRequest;
+import org.elasticsearch.rest.RestResponse;
+import org.elasticsearch.rest.action.RestBuilderListener;
+import org.elasticsearch.script.Script;
+import org.elasticsearch.script.ScriptContext;
+import org.elasticsearch.script.ScriptService;
+import org.elasticsearch.script.ScriptType;
+import org.elasticsearch.threadpool.ThreadPool;
+import org.elasticsearch.transport.TransportService;
+
+import java.io.IOException;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Objects;
+
+import static org.elasticsearch.action.ValidateActions.addValidationError;
+import static org.elasticsearch.rest.RestRequest.Method.GET;
+import static org.elasticsearch.rest.RestRequest.Method.POST;
+import static org.elasticsearch.rest.RestStatus.OK;
+
+public class PainlessExecuteAction extends Action<PainlessExecuteAction.Request, PainlessExecuteAction.Response, PainlessExecuteAction.RequestBuilder> {
+
+ static final PainlessExecuteAction INSTANCE = new PainlessExecuteAction();
+ private static final String NAME = "cluster:admin/scripts/painless/execute";
+
+ private PainlessExecuteAction() {
+ super(NAME);
+ }
+
+ @Override
+ public RequestBuilder newRequestBuilder(ElasticsearchClient client) {
+ return new RequestBuilder(client);
+ }
+
+ @Override
+ public Response newResponse() {
+ return new Response();
+ }
+
+ public static class Request extends ActionRequest implements ToXContent {
+
+ private static final ParseField SCRIPT_FIELD = new ParseField("script");
+ private static final ParseField CONTEXT_FIELD = new ParseField("context");
+ private static final ConstructingObjectParser<Request, Void> PARSER = new ConstructingObjectParser<>(
+ "painless_execute_request", args -> new Request((Script) args[0], (SupportedContext) args[1]));
+
+ static {
+ PARSER.declareObject(ConstructingObjectParser.constructorArg(), (p, c) -> Script.parse(p), SCRIPT_FIELD);
+ PARSER.declareObject(ConstructingObjectParser.optionalConstructorArg(), (p, c) -> {
+ // For now only accept an empty json object:
+ XContentParser.Token token = p.nextToken();
+ assert token == XContentParser.Token.FIELD_NAME;
+ String contextType = p.currentName();
+ token = p.nextToken();
+ assert token == XContentParser.Token.START_OBJECT;
+ token = p.nextToken();
+ assert token == XContentParser.Token.END_OBJECT;
+ token = p.nextToken();
+ assert token == XContentParser.Token.END_OBJECT;
+ return SupportedContext.valueOf(contextType.toUpperCase(Locale.ROOT));
+ }, CONTEXT_FIELD);
+ }
+
+ private Script script;
+ private SupportedContext context;
+
+ static Request parse(XContentParser parser) throws IOException {
+ return PARSER.parse(parser, null);
+ }
+
+ Request(Script script, SupportedContext context) {
+ this.script = Objects.requireNonNull(script);
+ this.context = context != null ? context : SupportedContext.PAINLESS_TEST;
+ }
+
+ Request() {
+ }
+
+ public Script getScript() {
+ return script;
+ }
+
+ public SupportedContext getContext() {
+ return context;
+ }
+
+ @Override
+ public ActionRequestValidationException validate() {
+ ActionRequestValidationException validationException = null;
+ if (script.getType() != ScriptType.INLINE) {
+ validationException = addValidationError("only inline scripts are supported", validationException);
+ }
+ return validationException;
+ }
+
+ @Override
+ public void readFrom(StreamInput in) throws IOException {
+ super.readFrom(in);
+ script = new Script(in);
+ context = SupportedContext.fromId(in.readByte());
+ }
+
+ @Override
+ public void writeTo(StreamOutput out) throws IOException {
+ super.writeTo(out);
+ script.writeTo(out);
+ out.writeByte(context.id);
+ }
+
+ // For testing only:
+ @Override
+ public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
+ builder.field(SCRIPT_FIELD.getPreferredName(), script);
+ builder.startObject(CONTEXT_FIELD.getPreferredName());
+ {
+ builder.startObject(context.name());
+ builder.endObject();
+ }
+ builder.endObject();
+ return builder;
+ }
+
+ @Override
+ public boolean equals(Object o) {
+ if (this == o) return true;
+ if (o == null || getClass() != o.getClass()) return false;
+ Request request = (Request) o;
+ return Objects.equals(script, request.script) &&
+ context == request.context;
+ }
+
+ @Override
+ public int hashCode() {
+ return Objects.hash(script, context);
+ }
+
+ public enum SupportedContext {
+
+ PAINLESS_TEST((byte) 0);
+
+ private final byte id;
+
+ SupportedContext(byte id) {
+ this.id = id;
+ }
+
+ public static SupportedContext fromId(byte id) {
+ switch (id) {
+ case 0:
+ return PAINLESS_TEST;
+ default:
+ throw new IllegalArgumentException("unknown context [" + id + "]");
+ }
+ }
+ }
+
+ }
+
+ public static class RequestBuilder extends ActionRequestBuilder<Request, Response, RequestBuilder> {
+
+ RequestBuilder(ElasticsearchClient client) {
+ super(client, INSTANCE, new Request());
+ }
+ }
+
+ public static class Response extends ActionResponse implements ToXContentObject {
+
+ private Object result;
+
+ Response() {}
+
+ Response(Object result) {
+ this.result = result;
+ }
+
+ public Object getResult() {
+ return result;
+ }
+
+ @Override
+ public void readFrom(StreamInput in) throws IOException {
+ super.readFrom(in);
+ result = in.readGenericValue();
+ }
+
+ @Override
+ public void writeTo(StreamOutput out) throws IOException {
+ super.writeTo(out);
+ out.writeGenericValue(result);
+ }
+
+ @Override
+ public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
+ builder.startObject();
+ builder.field("result", result);
+ return builder.endObject();
+ }
+
+ @Override
+ public boolean equals(Object o) {
+ if (this == o) return true;
+ if (o == null || getClass() != o.getClass()) return false;
+ Response response = (Response) o;
+ return Objects.equals(result, response.result);
+ }
+
+ @Override
+ public int hashCode() {
+ return Objects.hash(result);
+ }
+ }
+
+ public abstract static class PainlessTestScript {
+
+ private final Map<String, Object> params;
+
+ public PainlessTestScript(Map<String, Object> params) {
+ this.params = params;
+ }
+
+ /** Return the parameters for this script. */
+ public Map<String, Object> getParams() {
+ return params;
+ }
+
+ public abstract Object execute();
+
+ public interface Factory {
+
+ PainlessTestScript newInstance(Map<String, Object> params);
+
+ }
+
+ public static final String[] PARAMETERS = {};
+ public static final ScriptContext<Factory> CONTEXT = new ScriptContext<>("painless_test", Factory.class);
+
+ }
+
+ public static class TransportAction extends HandledTransportAction<Request, Response> {
+
+ private final ScriptService scriptService;
+
+ @Inject
+ public TransportAction(Settings settings, ThreadPool threadPool, TransportService transportService,
+ ActionFilters actionFilters, IndexNameExpressionResolver indexNameExpressionResolver,
+ ScriptService scriptService) {
+ super(settings, NAME, threadPool, transportService, actionFilters, indexNameExpressionResolver, Request::new);
+ this.scriptService = scriptService;
+ }
+ @Override
+ protected void doExecute(Request request, ActionListener<Response> listener) {
+ switch (request.context) {
+ case PAINLESS_TEST:
+ PainlessTestScript.Factory factory = scriptService.compile(request.script, PainlessTestScript.CONTEXT);
+ PainlessTestScript painlessTestScript = factory.newInstance(request.script.getParams());
+ String result = Objects.toString(painlessTestScript.execute());
+ listener.onResponse(new Response(result));
+ break;
+ default:
+ throw new UnsupportedOperationException("unsupported context [" + request.context + "]");
+ }
+ }
+
+ }
+
+ static class RestAction extends BaseRestHandler {
+
+ RestAction(Settings settings, RestController controller) {
+ super(settings);
+ controller.registerHandler(GET, "/_scripts/painless/_execute", this);
+ controller.registerHandler(POST, "/_scripts/painless/_execute", this);
+ }
+
+ @Override
+ public String getName() {
+ return "_scripts_painless_execute";
+ }
+
+ @Override
+ protected RestChannelConsumer prepareRequest(RestRequest restRequest, NodeClient client) throws IOException {
+ final Request request = Request.parse(restRequest.contentOrSourceParamParser());
+ return channel -> client.executeLocally(INSTANCE, request, new RestBuilderListener<Response>(channel) {
+ @Override
+ public RestResponse buildResponse(Response response, XContentBuilder builder) throws Exception {
+ response.toXContent(builder, ToXContent.EMPTY_PARAMS);
+ return new BytesRestResponse(OK, builder);
+ }
+ });
+ }
+ }
+
+}
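
A minimal client-side sketch of invoking the new action, assuming only the names introduced above (the `Request` constructor is package-private, so read this as same-package illustration; `client` stands for any ElasticsearchClient implementation, and the script body and params are made up):

    // assumed imports: org.elasticsearch.script.Script, org.elasticsearch.script.ScriptType,
    // org.elasticsearch.action.ActionListener, java.util.Collections
    Script script = new Script(ScriptType.INLINE, "painless",
            "params.count * 2", Collections.singletonMap("count", 10));
    // a null context falls back to SupportedContext.PAINLESS_TEST (see the Request constructor)
    Request request = new Request(script, null);
    client.execute(PainlessExecuteAction.INSTANCE, request, ActionListener.wrap(
            response -> System.out.println(response.getResult()), // prints "20"
            e -> { throw new RuntimeException(e); }));

Note that validate() rejects anything but inline scripts, so stored or file scripts would fail before execution.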
diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/PainlessPlugin.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/PainlessPlugin.java
index 795d81bb6e058..0364ad667efc7 100644
--- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/PainlessPlugin.java
+++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/PainlessPlugin.java
@@ -20,28 +20,40 @@
package org.elasticsearch.painless;
+import org.elasticsearch.action.ActionRequest;
+import org.elasticsearch.action.ActionResponse;
+import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver;
+import org.elasticsearch.cluster.node.DiscoveryNodes;
+import org.elasticsearch.common.settings.ClusterSettings;
+import org.elasticsearch.common.settings.IndexScopedSettings;
import org.elasticsearch.common.settings.Setting;
import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.common.settings.SettingsFilter;
import org.elasticsearch.painless.spi.PainlessExtension;
import org.elasticsearch.painless.spi.Whitelist;
+import org.elasticsearch.plugins.ActionPlugin;
import org.elasticsearch.plugins.ExtensiblePlugin;
import org.elasticsearch.plugins.Plugin;
import org.elasticsearch.plugins.ScriptPlugin;
+import org.elasticsearch.rest.RestController;
+import org.elasticsearch.rest.RestHandler;
import org.elasticsearch.script.ScriptContext;
import org.elasticsearch.script.ScriptEngine;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
+import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.ServiceLoader;
+import java.util.function.Supplier;
/**
* Registers Painless as a plugin.
*/
-public final class PainlessPlugin extends Plugin implements ScriptPlugin, ExtensiblePlugin {
+public final class PainlessPlugin extends Plugin implements ScriptPlugin, ExtensiblePlugin, ActionPlugin {
private final Map<ScriptContext<?>, List<Whitelist>> extendedWhitelists = new HashMap<>();
@@ -74,4 +86,24 @@ public void reloadSPI(ClassLoader loader) {
}
}
}
+
+ @SuppressWarnings("rawtypes")
+ public List<ScriptContext> getContexts() {
+ return Collections.singletonList(PainlessExecuteAction.PainlessTestScript.CONTEXT);
+ }
+
+ @Override
+ public List<ActionHandler<? extends ActionRequest, ? extends ActionResponse>> getActions() {
+ return Collections.singletonList(
+ new ActionHandler<>(PainlessExecuteAction.INSTANCE, PainlessExecuteAction.TransportAction.class)
+ );
+ }
+
+ @Override
+ public List<RestHandler> getRestHandlers(Settings settings, RestController restController, ClusterSettings clusterSettings,
+ IndexScopedSettings indexScopedSettings, SettingsFilter settingsFilter,
+ IndexNameExpressionResolver indexNameExpressionResolver,
+ Supplier<DiscoveryNodes> nodesInCluster) {
+ return Collections.singletonList(new PainlessExecuteAction.RestAction(settings, restController));
+ }
}
diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/EnhancedPainlessLexer.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/EnhancedPainlessLexer.java
index 506ac8fcdecdb..adef4d3642571 100644
--- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/EnhancedPainlessLexer.java
+++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/EnhancedPainlessLexer.java
@@ -44,8 +44,7 @@ final class EnhancedPainlessLexer extends PainlessLexer {
private final String sourceName;
private final Definition definition;
- private Token stashedNext = null;
- private Token previous = null;
+ private Token current = null;
EnhancedPainlessLexer(CharStream charStream, String sourceName, Definition definition) {
super(charStream);
@@ -53,27 +52,10 @@ final class EnhancedPainlessLexer extends PainlessLexer {
this.definition = definition;
}
- public Token getPreviousToken() {
- return previous;
- }
-
@Override
public Token nextToken() {
- if (stashedNext != null) {
- previous = stashedNext;
- stashedNext = null;
- return previous;
- }
- Token next = super.nextToken();
- if (insertSemicolon(previous, next)) {
- stashedNext = next;
- previous = _factory.create(new Pair<TokenSource, CharStream>(this, _input), PainlessLexer.SEMICOLON, ";",
- Lexer.DEFAULT_TOKEN_CHANNEL, next.getStartIndex(), next.getStopIndex(), next.getLine(), next.getCharPositionInLine());
- return previous;
- } else {
- previous = next;
- return next;
- }
+ current = super.nextToken();
+ return current;
}
@Override
@@ -101,7 +83,7 @@ protected boolean isSimpleType(String name) {
@Override
protected boolean slashIsRegex() {
- Token lastToken = getPreviousToken();
+ Token lastToken = current;
if (lastToken == null) {
return true;
}
@@ -120,18 +102,4 @@ protected boolean slashIsRegex() {
return true;
}
}
-
- private static boolean insertSemicolon(Token previous, Token next) {
- if (previous == null || next.getType() != PainlessLexer.RBRACK) {
- return false;
- }
- switch (previous.getType()) {
- case PainlessLexer.RBRACK: // };} would be weird!
- case PainlessLexer.SEMICOLON: // already have a semicolon, no need to add one
- case PainlessLexer.LBRACK: // empty blocks don't need a semicolon
- return false;
- default:
- return true;
- }
- }
}
diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/PainlessParser.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/PainlessParser.java
index 528a8a3d851c6..bba53d650ad32 100644
--- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/PainlessParser.java
+++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/PainlessParser.java
@@ -30,21 +30,21 @@ class PainlessParser extends Parser {
ID=82, DOTINTEGER=83, DOTID=84;
public static final int
RULE_source = 0, RULE_function = 1, RULE_parameters = 2, RULE_statement = 3,
- RULE_trailer = 4, RULE_block = 5, RULE_empty = 6, RULE_initializer = 7,
- RULE_afterthought = 8, RULE_declaration = 9, RULE_decltype = 10, RULE_declvar = 11,
- RULE_trap = 12, RULE_delimiter = 13, RULE_expression = 14, RULE_unary = 15,
- RULE_chain = 16, RULE_primary = 17, RULE_postfix = 18, RULE_postdot = 19,
- RULE_callinvoke = 20, RULE_fieldaccess = 21, RULE_braceaccess = 22, RULE_arrayinitializer = 23,
- RULE_listinitializer = 24, RULE_mapinitializer = 25, RULE_maptoken = 26,
- RULE_arguments = 27, RULE_argument = 28, RULE_lambda = 29, RULE_lamtype = 30,
- RULE_funcref = 31;
+ RULE_rstatement = 4, RULE_dstatement = 5, RULE_trailer = 6, RULE_block = 7,
+ RULE_empty = 8, RULE_initializer = 9, RULE_afterthought = 10, RULE_declaration = 11,
+ RULE_decltype = 12, RULE_declvar = 13, RULE_trap = 14, RULE_expression = 15,
+ RULE_unary = 16, RULE_chain = 17, RULE_primary = 18, RULE_postfix = 19,
+ RULE_postdot = 20, RULE_callinvoke = 21, RULE_fieldaccess = 22, RULE_braceaccess = 23,
+ RULE_arrayinitializer = 24, RULE_listinitializer = 25, RULE_mapinitializer = 26,
+ RULE_maptoken = 27, RULE_arguments = 28, RULE_argument = 29, RULE_lambda = 30,
+ RULE_lamtype = 31, RULE_funcref = 32;
public static final String[] ruleNames = {
- "source", "function", "parameters", "statement", "trailer", "block", "empty",
- "initializer", "afterthought", "declaration", "decltype", "declvar", "trap",
- "delimiter", "expression", "unary", "chain", "primary", "postfix", "postdot",
- "callinvoke", "fieldaccess", "braceaccess", "arrayinitializer", "listinitializer",
- "mapinitializer", "maptoken", "arguments", "argument", "lambda", "lamtype",
- "funcref"
+ "source", "function", "parameters", "statement", "rstatement", "dstatement",
+ "trailer", "block", "empty", "initializer", "afterthought", "declaration",
+ "decltype", "declvar", "trap", "expression", "unary", "chain", "primary",
+ "postfix", "postdot", "callinvoke", "fieldaccess", "braceaccess", "arrayinitializer",
+ "listinitializer", "mapinitializer", "maptoken", "arguments", "argument",
+ "lambda", "lamtype", "funcref"
};
private static final String[] _LITERAL_NAMES = {
@@ -133,6 +133,9 @@ public List<StatementContext> statement() {
public StatementContext statement(int i) {
return getRuleContext(StatementContext.class,i);
}
+ public DstatementContext dstatement() {
+ return getRuleContext(DstatementContext.class,0);
+ }
public SourceContext(ParserRuleContext parent, int invokingState) {
super(parent, invokingState);
}
@@ -152,37 +155,48 @@ public final SourceContext source() throws RecognitionException {
int _alt;
enterOuterAlt(_localctx, 1);
{
- setState(67);
+ setState(69);
_errHandler.sync(this);
_alt = getInterpreter().adaptivePredict(_input,0,_ctx);
while ( _alt!=2 && _alt!=org.antlr.v4.runtime.atn.ATN.INVALID_ALT_NUMBER ) {
if ( _alt==1 ) {
{
{
- setState(64);
+ setState(66);
function();
}
}
}
- setState(69);
+ setState(71);
_errHandler.sync(this);
_alt = getInterpreter().adaptivePredict(_input,0,_ctx);
}
- setState(73);
+ setState(75);
_errHandler.sync(this);
+ _alt = getInterpreter().adaptivePredict(_input,1,_ctx);
+ while ( _alt!=2 && _alt!=org.antlr.v4.runtime.atn.ATN.INVALID_ALT_NUMBER ) {
+ if ( _alt==1 ) {
+ {
+ {
+ setState(72);
+ statement();
+ }
+ }
+ }
+ setState(77);
+ _errHandler.sync(this);
+ _alt = getInterpreter().adaptivePredict(_input,1,_ctx);
+ }
+ setState(79);
_la = _input.LA(1);
- while ((((_la) & ~0x3f) == 0 && ((1L << _la) & ((1L << LBRACE) | (1L << LP) | (1L << IF) | (1L << WHILE) | (1L << DO) | (1L << FOR) | (1L << CONTINUE) | (1L << BREAK) | (1L << RETURN) | (1L << NEW) | (1L << TRY) | (1L << THROW) | (1L << BOOLNOT) | (1L << BWNOT) | (1L << ADD) | (1L << SUB) | (1L << INCR) | (1L << DECR))) != 0) || ((((_la - 72)) & ~0x3f) == 0 && ((1L << (_la - 72)) & ((1L << (OCTAL - 72)) | (1L << (HEX - 72)) | (1L << (INTEGER - 72)) | (1L << (DECIMAL - 72)) | (1L << (STRING - 72)) | (1L << (REGEX - 72)) | (1L << (TRUE - 72)) | (1L << (FALSE - 72)) | (1L << (NULL - 72)) | (1L << (TYPE - 72)) | (1L << (ID - 72)))) != 0)) {
+ if ((((_la) & ~0x3f) == 0 && ((1L << _la) & ((1L << LBRACE) | (1L << LP) | (1L << DO) | (1L << CONTINUE) | (1L << BREAK) | (1L << RETURN) | (1L << NEW) | (1L << THROW) | (1L << BOOLNOT) | (1L << BWNOT) | (1L << ADD) | (1L << SUB) | (1L << INCR) | (1L << DECR))) != 0) || ((((_la - 72)) & ~0x3f) == 0 && ((1L << (_la - 72)) & ((1L << (OCTAL - 72)) | (1L << (HEX - 72)) | (1L << (INTEGER - 72)) | (1L << (DECIMAL - 72)) | (1L << (STRING - 72)) | (1L << (REGEX - 72)) | (1L << (TRUE - 72)) | (1L << (FALSE - 72)) | (1L << (NULL - 72)) | (1L << (TYPE - 72)) | (1L << (ID - 72)))) != 0)) {
{
- {
- setState(70);
- statement();
- }
+ setState(78);
+ dstatement();
}
- setState(75);
- _errHandler.sync(this);
- _la = _input.LA(1);
}
- setState(76);
+
+ setState(81);
match(EOF);
}
}
@@ -225,13 +239,13 @@ public final FunctionContext function() throws RecognitionException {
try {
enterOuterAlt(_localctx, 1);
{
- setState(78);
+ setState(83);
decltype();
- setState(79);
+ setState(84);
match(ID);
- setState(80);
+ setState(85);
parameters();
- setState(81);
+ setState(86);
block();
}
}
@@ -281,38 +295,38 @@ public final ParametersContext parameters() throws RecognitionException {
try {
enterOuterAlt(_localctx, 1);
{
- setState(83);
+ setState(88);
match(LP);
- setState(95);
+ setState(100);
_la = _input.LA(1);
if (_la==TYPE) {
{
- setState(84);
+ setState(89);
decltype();
- setState(85);
+ setState(90);
match(ID);
- setState(92);
+ setState(97);
_errHandler.sync(this);
_la = _input.LA(1);
while (_la==COMMA) {
{
{
- setState(86);
+ setState(91);
match(COMMA);
- setState(87);
+ setState(92);
decltype();
- setState(88);
+ setState(93);
match(ID);
}
}
- setState(94);
+ setState(99);
_errHandler.sync(this);
_la = _input.LA(1);
}
}
}
- setState(97);
+ setState(102);
match(RP);
}
}
@@ -328,43 +342,100 @@ public final ParametersContext parameters() throws RecognitionException {
}
public static class StatementContext extends ParserRuleContext {
+ public RstatementContext rstatement() {
+ return getRuleContext(RstatementContext.class,0);
+ }
+ public DstatementContext dstatement() {
+ return getRuleContext(DstatementContext.class,0);
+ }
+ public TerminalNode SEMICOLON() { return getToken(PainlessParser.SEMICOLON, 0); }
public StatementContext(ParserRuleContext parent, int invokingState) {
super(parent, invokingState);
}
@Override public int getRuleIndex() { return RULE_statement; }
-
- public StatementContext() { }
- public void copyFrom(StatementContext ctx) {
- super.copyFrom(ctx);
+ @Override
+ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
+ if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitStatement(this);
+ else return visitor.visitChildren(this);
}
}
- public static class DeclContext extends StatementContext {
- public DeclarationContext declaration() {
- return getRuleContext(DeclarationContext.class,0);
+
+ public final StatementContext statement() throws RecognitionException {
+ StatementContext _localctx = new StatementContext(_ctx, getState());
+ enterRule(_localctx, 6, RULE_statement);
+ try {
+ setState(108);
+ switch (_input.LA(1)) {
+ case IF:
+ case WHILE:
+ case FOR:
+ case TRY:
+ enterOuterAlt(_localctx, 1);
+ {
+ setState(104);
+ rstatement();
+ }
+ break;
+ case LBRACE:
+ case LP:
+ case DO:
+ case CONTINUE:
+ case BREAK:
+ case RETURN:
+ case NEW:
+ case THROW:
+ case BOOLNOT:
+ case BWNOT:
+ case ADD:
+ case SUB:
+ case INCR:
+ case DECR:
+ case OCTAL:
+ case HEX:
+ case INTEGER:
+ case DECIMAL:
+ case STRING:
+ case REGEX:
+ case TRUE:
+ case FALSE:
+ case NULL:
+ case TYPE:
+ case ID:
+ enterOuterAlt(_localctx, 2);
+ {
+ setState(105);
+ dstatement();
+ setState(106);
+ match(SEMICOLON);
+ }
+ break;
+ default:
+ throw new NoViableAltException(this);
+ }
}
- public DelimiterContext delimiter() {
- return getRuleContext(DelimiterContext.class,0);
+ catch (RecognitionException re) {
+ _localctx.exception = re;
+ _errHandler.reportError(this, re);
+ _errHandler.recover(this, re);
}
- public DeclContext(StatementContext ctx) { copyFrom(ctx); }
- @Override
- public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
- if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitDecl(this);
- else return visitor.visitChildren(this);
+ finally {
+ exitRule();
}
+ return _localctx;
}
- public static class BreakContext extends StatementContext {
- public TerminalNode BREAK() { return getToken(PainlessParser.BREAK, 0); }
- public DelimiterContext delimiter() {
- return getRuleContext(DelimiterContext.class,0);
+
+ public static class RstatementContext extends ParserRuleContext {
+ public RstatementContext(ParserRuleContext parent, int invokingState) {
+ super(parent, invokingState);
}
- public BreakContext(StatementContext ctx) { copyFrom(ctx); }
- @Override
- public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
- if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitBreak(this);
- else return visitor.visitChildren(this);
+ @Override public int getRuleIndex() { return RULE_rstatement; }
+
+ public RstatementContext() { }
+ public void copyFrom(RstatementContext ctx) {
+ super.copyFrom(ctx);
}
}
- public static class ForContext extends StatementContext {
+ public static class ForContext extends RstatementContext {
public TerminalNode FOR() { return getToken(PainlessParser.FOR, 0); }
public TerminalNode LP() { return getToken(PainlessParser.LP, 0); }
public List<TerminalNode> SEMICOLON() { return getTokens(PainlessParser.SEMICOLON); }
@@ -387,35 +458,32 @@ public ExpressionContext expression() {
public AfterthoughtContext afterthought() {
return getRuleContext(AfterthoughtContext.class,0);
}
- public ForContext(StatementContext ctx) { copyFrom(ctx); }
+ public ForContext(RstatementContext ctx) { copyFrom(ctx); }
@Override
public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitFor(this);
else return visitor.visitChildren(this);
}
}
- public static class DoContext extends StatementContext {
- public TerminalNode DO() { return getToken(PainlessParser.DO, 0); }
+ public static class TryContext extends RstatementContext {
+ public TerminalNode TRY() { return getToken(PainlessParser.TRY, 0); }
public BlockContext block() {
return getRuleContext(BlockContext.class,0);
}
- public TerminalNode WHILE() { return getToken(PainlessParser.WHILE, 0); }
- public TerminalNode LP() { return getToken(PainlessParser.LP, 0); }
- public ExpressionContext expression() {
- return getRuleContext(ExpressionContext.class,0);
+ public List<TrapContext> trap() {
+ return getRuleContexts(TrapContext.class);
}
- public TerminalNode RP() { return getToken(PainlessParser.RP, 0); }
- public DelimiterContext delimiter() {
- return getRuleContext(DelimiterContext.class,0);
+ public TrapContext trap(int i) {
+ return getRuleContext(TrapContext.class,i);
}
- public DoContext(StatementContext ctx) { copyFrom(ctx); }
+ public TryContext(RstatementContext ctx) { copyFrom(ctx); }
@Override
public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
- if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitDo(this);
+ if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitTry(this);
else return visitor.visitChildren(this);
}
}
- public static class WhileContext extends StatementContext {
+ public static class WhileContext extends RstatementContext {
public TerminalNode WHILE() { return getToken(PainlessParser.WHILE, 0); }
public TerminalNode LP() { return getToken(PainlessParser.LP, 0); }
public ExpressionContext expression() {
@@ -428,14 +496,14 @@ public TrailerContext trailer() {
public EmptyContext empty() {
return getRuleContext(EmptyContext.class,0);
}
- public WhileContext(StatementContext ctx) { copyFrom(ctx); }
+ public WhileContext(RstatementContext ctx) { copyFrom(ctx); }
@Override
public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitWhile(this);
else return visitor.visitChildren(this);
}
}
- public static class IneachContext extends StatementContext {
+ public static class IneachContext extends RstatementContext {
public TerminalNode FOR() { return getToken(PainlessParser.FOR, 0); }
public TerminalNode LP() { return getToken(PainlessParser.LP, 0); }
public TerminalNode ID() { return getToken(PainlessParser.ID, 0); }
@@ -447,95 +515,14 @@ public ExpressionContext expression() {
public TrailerContext trailer() {
return getRuleContext(TrailerContext.class,0);
}
- public IneachContext(StatementContext ctx) { copyFrom(ctx); }
+ public IneachContext(RstatementContext ctx) { copyFrom(ctx); }
@Override
public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitIneach(this);
else return visitor.visitChildren(this);
}
}
- public static class EachContext extends StatementContext {
- public TerminalNode FOR() { return getToken(PainlessParser.FOR, 0); }
- public TerminalNode LP() { return getToken(PainlessParser.LP, 0); }
- public DecltypeContext decltype() {
- return getRuleContext(DecltypeContext.class,0);
- }
- public TerminalNode ID() { return getToken(PainlessParser.ID, 0); }
- public TerminalNode COLON() { return getToken(PainlessParser.COLON, 0); }
- public ExpressionContext expression() {
- return getRuleContext(ExpressionContext.class,0);
- }
- public TerminalNode RP() { return getToken(PainlessParser.RP, 0); }
- public TrailerContext trailer() {
- return getRuleContext(TrailerContext.class,0);
- }
- public EachContext(StatementContext ctx) { copyFrom(ctx); }
- @Override
- public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
- if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitEach(this);
- else return visitor.visitChildren(this);
- }
- }
- public static class ThrowContext extends StatementContext {
- public TerminalNode THROW() { return getToken(PainlessParser.THROW, 0); }
- public ExpressionContext expression() {
- return getRuleContext(ExpressionContext.class,0);
- }
- public DelimiterContext delimiter() {
- return getRuleContext(DelimiterContext.class,0);
- }
- public ThrowContext(StatementContext ctx) { copyFrom(ctx); }
- @Override
- public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
- if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitThrow(this);
- else return visitor.visitChildren(this);
- }
- }
- public static class ContinueContext extends StatementContext {
- public TerminalNode CONTINUE() { return getToken(PainlessParser.CONTINUE, 0); }
- public DelimiterContext delimiter() {
- return getRuleContext(DelimiterContext.class,0);
- }
- public ContinueContext(StatementContext ctx) { copyFrom(ctx); }
- @Override
- public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
- if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitContinue(this);
- else return visitor.visitChildren(this);
- }
- }
- public static class TryContext extends StatementContext {
- public TerminalNode TRY() { return getToken(PainlessParser.TRY, 0); }
- public BlockContext block() {
- return getRuleContext(BlockContext.class,0);
- }
- public List<TrapContext> trap() {
- return getRuleContexts(TrapContext.class);
- }
- public TrapContext trap(int i) {
- return getRuleContext(TrapContext.class,i);
- }
- public TryContext(StatementContext ctx) { copyFrom(ctx); }
- @Override
- public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
- if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitTry(this);
- else return visitor.visitChildren(this);
- }
- }
- public static class ExprContext extends StatementContext {
- public ExpressionContext expression() {
- return getRuleContext(ExpressionContext.class,0);
- }
- public DelimiterContext delimiter() {
- return getRuleContext(DelimiterContext.class,0);
- }
- public ExprContext(StatementContext ctx) { copyFrom(ctx); }
- @Override
- public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
- if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitExpr(this);
- else return visitor.visitChildren(this);
- }
- }
- public static class IfContext extends StatementContext {
+ public static class IfContext extends RstatementContext {
public TerminalNode IF() { return getToken(PainlessParser.IF, 0); }
public TerminalNode LP() { return getToken(PainlessParser.LP, 0); }
public ExpressionContext expression() {
@@ -549,66 +536,73 @@ public TrailerContext trailer(int i) {
return getRuleContext(TrailerContext.class,i);
}
public TerminalNode ELSE() { return getToken(PainlessParser.ELSE, 0); }
- public IfContext(StatementContext ctx) { copyFrom(ctx); }
+ public IfContext(RstatementContext ctx) { copyFrom(ctx); }
@Override
public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitIf(this);
else return visitor.visitChildren(this);
}
}
- public static class ReturnContext extends StatementContext {
- public TerminalNode RETURN() { return getToken(PainlessParser.RETURN, 0); }
+ public static class EachContext extends RstatementContext {
+ public TerminalNode FOR() { return getToken(PainlessParser.FOR, 0); }
+ public TerminalNode LP() { return getToken(PainlessParser.LP, 0); }
+ public DecltypeContext decltype() {
+ return getRuleContext(DecltypeContext.class,0);
+ }
+ public TerminalNode ID() { return getToken(PainlessParser.ID, 0); }
+ public TerminalNode COLON() { return getToken(PainlessParser.COLON, 0); }
public ExpressionContext expression() {
return getRuleContext(ExpressionContext.class,0);
}
- public DelimiterContext delimiter() {
- return getRuleContext(DelimiterContext.class,0);
+ public TerminalNode RP() { return getToken(PainlessParser.RP, 0); }
+ public TrailerContext trailer() {
+ return getRuleContext(TrailerContext.class,0);
}
- public ReturnContext(StatementContext ctx) { copyFrom(ctx); }
+ public EachContext(RstatementContext ctx) { copyFrom(ctx); }
@Override
public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
- if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitReturn(this);
+ if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitEach(this);
else return visitor.visitChildren(this);
}
}
- public final StatementContext statement() throws RecognitionException {
- StatementContext _localctx = new StatementContext(_ctx, getState());
- enterRule(_localctx, 6, RULE_statement);
+ public final RstatementContext rstatement() throws RecognitionException {
+ RstatementContext _localctx = new RstatementContext(_ctx, getState());
+ enterRule(_localctx, 8, RULE_rstatement);
int _la;
try {
int _alt;
- setState(185);
+ setState(170);
_errHandler.sync(this);
- switch ( getInterpreter().adaptivePredict(_input,11,_ctx) ) {
+ switch ( getInterpreter().adaptivePredict(_input,13,_ctx) ) {
case 1:
_localctx = new IfContext(_localctx);
enterOuterAlt(_localctx, 1);
{
- setState(99);
+ setState(110);
match(IF);
- setState(100);
+ setState(111);
match(LP);
- setState(101);
+ setState(112);
expression(0);
- setState(102);
+ setState(113);
match(RP);
- setState(103);
+ setState(114);
trailer();
- setState(107);
+ setState(118);
_errHandler.sync(this);
- switch ( getInterpreter().adaptivePredict(_input,4,_ctx) ) {
+ switch ( getInterpreter().adaptivePredict(_input,6,_ctx) ) {
case 1:
{
- setState(104);
+ setState(115);
match(ELSE);
- setState(105);
+ setState(116);
trailer();
}
break;
case 2:
{
- setState(106);
+ setState(117);
if (!( _input.LA(1) != ELSE )) throw new FailedPredicateException(this, " _input.LA(1) != ELSE ");
}
break;
@@ -619,15 +613,15 @@ public final StatementContext statement() throws RecognitionException {
_localctx = new WhileContext(_localctx);
enterOuterAlt(_localctx, 2);
{
- setState(109);
+ setState(120);
match(WHILE);
- setState(110);
+ setState(121);
match(LP);
- setState(111);
+ setState(122);
expression(0);
- setState(112);
+ setState(123);
match(RP);
- setState(115);
+ setState(126);
switch (_input.LA(1)) {
case LBRACK:
case LBRACE:
@@ -660,13 +654,13 @@ public final StatementContext statement() throws RecognitionException {
case TYPE:
case ID:
{
- setState(113);
+ setState(124);
trailer();
}
break;
case SEMICOLON:
{
- setState(114);
+ setState(125);
empty();
}
break;
@@ -676,67 +670,47 @@ public final StatementContext statement() throws RecognitionException {
}
break;
case 3:
- _localctx = new DoContext(_localctx);
- enterOuterAlt(_localctx, 3);
- {
- setState(117);
- match(DO);
- setState(118);
- block();
- setState(119);
- match(WHILE);
- setState(120);
- match(LP);
- setState(121);
- expression(0);
- setState(122);
- match(RP);
- setState(123);
- delimiter();
- }
- break;
- case 4:
_localctx = new ForContext(_localctx);
- enterOuterAlt(_localctx, 4);
+ enterOuterAlt(_localctx, 3);
{
- setState(125);
+ setState(128);
match(FOR);
- setState(126);
+ setState(129);
match(LP);
- setState(128);
+ setState(131);
_la = _input.LA(1);
if ((((_la) & ~0x3f) == 0 && ((1L << _la) & ((1L << LBRACE) | (1L << LP) | (1L << NEW) | (1L << BOOLNOT) | (1L << BWNOT) | (1L << ADD) | (1L << SUB) | (1L << INCR) | (1L << DECR))) != 0) || ((((_la - 72)) & ~0x3f) == 0 && ((1L << (_la - 72)) & ((1L << (OCTAL - 72)) | (1L << (HEX - 72)) | (1L << (INTEGER - 72)) | (1L << (DECIMAL - 72)) | (1L << (STRING - 72)) | (1L << (REGEX - 72)) | (1L << (TRUE - 72)) | (1L << (FALSE - 72)) | (1L << (NULL - 72)) | (1L << (TYPE - 72)) | (1L << (ID - 72)))) != 0)) {
{
- setState(127);
+ setState(130);
initializer();
}
}
- setState(130);
+ setState(133);
match(SEMICOLON);
- setState(132);
+ setState(135);
_la = _input.LA(1);
if ((((_la) & ~0x3f) == 0 && ((1L << _la) & ((1L << LBRACE) | (1L << LP) | (1L << NEW) | (1L << BOOLNOT) | (1L << BWNOT) | (1L << ADD) | (1L << SUB) | (1L << INCR) | (1L << DECR))) != 0) || ((((_la - 72)) & ~0x3f) == 0 && ((1L << (_la - 72)) & ((1L << (OCTAL - 72)) | (1L << (HEX - 72)) | (1L << (INTEGER - 72)) | (1L << (DECIMAL - 72)) | (1L << (STRING - 72)) | (1L << (REGEX - 72)) | (1L << (TRUE - 72)) | (1L << (FALSE - 72)) | (1L << (NULL - 72)) | (1L << (TYPE - 72)) | (1L << (ID - 72)))) != 0)) {
{
- setState(131);
+ setState(134);
expression(0);
}
}
- setState(134);
+ setState(137);
match(SEMICOLON);
- setState(136);
+ setState(139);
_la = _input.LA(1);
if ((((_la) & ~0x3f) == 0 && ((1L << _la) & ((1L << LBRACE) | (1L << LP) | (1L << NEW) | (1L << BOOLNOT) | (1L << BWNOT) | (1L << ADD) | (1L << SUB) | (1L << INCR) | (1L << DECR))) != 0) || ((((_la - 72)) & ~0x3f) == 0 && ((1L << (_la - 72)) & ((1L << (OCTAL - 72)) | (1L << (HEX - 72)) | (1L << (INTEGER - 72)) | (1L << (DECIMAL - 72)) | (1L << (STRING - 72)) | (1L << (REGEX - 72)) | (1L << (TRUE - 72)) | (1L << (FALSE - 72)) | (1L << (NULL - 72)) | (1L << (TYPE - 72)) | (1L << (ID - 72)))) != 0)) {
{
- setState(135);
+ setState(138);
afterthought();
}
}
- setState(138);
- match(RP);
setState(141);
+ match(RP);
+ setState(144);
switch (_input.LA(1)) {
case LBRACK:
case LBRACE:
@@ -769,13 +743,13 @@ public final StatementContext statement() throws RecognitionException {
case TYPE:
case ID:
{
- setState(139);
+ setState(142);
trailer();
}
break;
case SEMICOLON:
{
- setState(140);
+ setState(143);
empty();
}
break;
@@ -784,99 +758,57 @@ public final StatementContext statement() throws RecognitionException {
}
}
break;
- case 5:
+ case 4:
_localctx = new EachContext(_localctx);
- enterOuterAlt(_localctx, 5);
+ enterOuterAlt(_localctx, 4);
{
- setState(143);
+ setState(146);
match(FOR);
- setState(144);
+ setState(147);
match(LP);
- setState(145);
+ setState(148);
decltype();
- setState(146);
+ setState(149);
match(ID);
- setState(147);
+ setState(150);
match(COLON);
- setState(148);
+ setState(151);
expression(0);
- setState(149);
+ setState(152);
match(RP);
- setState(150);
+ setState(153);
trailer();
}
break;
- case 6:
+ case 5:
_localctx = new IneachContext(_localctx);
- enterOuterAlt(_localctx, 6);
+ enterOuterAlt(_localctx, 5);
{
- setState(152);
+ setState(155);
match(FOR);
- setState(153);
+ setState(156);
match(LP);
- setState(154);
+ setState(157);
match(ID);
- setState(155);
+ setState(158);
match(IN);
- setState(156);
+ setState(159);
expression(0);
- setState(157);
- match(RP);
- setState(158);
- trailer();
- }
- break;
- case 7:
- _localctx = new DeclContext(_localctx);
- enterOuterAlt(_localctx, 7);
- {
setState(160);
- declaration();
+ match(RP);
setState(161);
- delimiter();
- }
- break;
- case 8:
- _localctx = new ContinueContext(_localctx);
- enterOuterAlt(_localctx, 8);
- {
- setState(163);
- match(CONTINUE);
- setState(164);
- delimiter();
- }
- break;
- case 9:
- _localctx = new BreakContext(_localctx);
- enterOuterAlt(_localctx, 9);
- {
- setState(165);
- match(BREAK);
- setState(166);
- delimiter();
- }
- break;
- case 10:
- _localctx = new ReturnContext(_localctx);
- enterOuterAlt(_localctx, 10);
- {
- setState(167);
- match(RETURN);
- setState(168);
- expression(0);
- setState(169);
- delimiter();
+ trailer();
}
break;
- case 11:
+ case 6:
_localctx = new TryContext(_localctx);
- enterOuterAlt(_localctx, 11);
+ enterOuterAlt(_localctx, 6);
{
- setState(171);
+ setState(163);
match(TRY);
- setState(172);
+ setState(164);
block();
- setState(174);
+ setState(166);
_errHandler.sync(this);
_alt = 1;
do {
@@ -884,7 +816,7 @@ public final StatementContext statement() throws RecognitionException {
case 1:
{
{
- setState(173);
+ setState(165);
trap();
}
}
@@ -892,32 +824,194 @@ public final StatementContext statement() throws RecognitionException {
default:
throw new NoViableAltException(this);
}
- setState(176);
+ setState(168);
_errHandler.sync(this);
- _alt = getInterpreter().adaptivePredict(_input,10,_ctx);
+ _alt = getInterpreter().adaptivePredict(_input,12,_ctx);
} while ( _alt!=2 && _alt!=org.antlr.v4.runtime.atn.ATN.INVALID_ALT_NUMBER );
}
break;
- case 12:
+ }
+ }
+ catch (RecognitionException re) {
+ _localctx.exception = re;
+ _errHandler.reportError(this, re);
+ _errHandler.recover(this, re);
+ }
+ finally {
+ exitRule();
+ }
+ return _localctx;
+ }
+
+ public static class DstatementContext extends ParserRuleContext {
+ public DstatementContext(ParserRuleContext parent, int invokingState) {
+ super(parent, invokingState);
+ }
+ @Override public int getRuleIndex() { return RULE_dstatement; }
+
+ public DstatementContext() { }
+ public void copyFrom(DstatementContext ctx) {
+ super.copyFrom(ctx);
+ }
+ }
+ public static class DeclContext extends DstatementContext {
+ public DeclarationContext declaration() {
+ return getRuleContext(DeclarationContext.class,0);
+ }
+ public DeclContext(DstatementContext ctx) { copyFrom(ctx); }
+ @Override
+ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
+ if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitDecl(this);
+ else return visitor.visitChildren(this);
+ }
+ }
+ public static class BreakContext extends DstatementContext {
+ public TerminalNode BREAK() { return getToken(PainlessParser.BREAK, 0); }
+ public BreakContext(DstatementContext ctx) { copyFrom(ctx); }
+ @Override
+ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
+ if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitBreak(this);
+ else return visitor.visitChildren(this);
+ }
+ }
+ public static class ThrowContext extends DstatementContext {
+ public TerminalNode THROW() { return getToken(PainlessParser.THROW, 0); }
+ public ExpressionContext expression() {
+ return getRuleContext(ExpressionContext.class,0);
+ }
+ public ThrowContext(DstatementContext ctx) { copyFrom(ctx); }
+ @Override
+ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
+ if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitThrow(this);
+ else return visitor.visitChildren(this);
+ }
+ }
+ public static class ContinueContext extends DstatementContext {
+ public TerminalNode CONTINUE() { return getToken(PainlessParser.CONTINUE, 0); }
+ public ContinueContext(DstatementContext ctx) { copyFrom(ctx); }
+ @Override
+ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
+ if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitContinue(this);
+ else return visitor.visitChildren(this);
+ }
+ }
+ public static class ExprContext extends DstatementContext {
+ public ExpressionContext expression() {
+ return getRuleContext(ExpressionContext.class,0);
+ }
+ public ExprContext(DstatementContext ctx) { copyFrom(ctx); }
+ @Override
+ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
+ if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitExpr(this);
+ else return visitor.visitChildren(this);
+ }
+ }
+ public static class DoContext extends DstatementContext {
+ public TerminalNode DO() { return getToken(PainlessParser.DO, 0); }
+ public BlockContext block() {
+ return getRuleContext(BlockContext.class,0);
+ }
+ public TerminalNode WHILE() { return getToken(PainlessParser.WHILE, 0); }
+ public TerminalNode LP() { return getToken(PainlessParser.LP, 0); }
+ public ExpressionContext expression() {
+ return getRuleContext(ExpressionContext.class,0);
+ }
+ public TerminalNode RP() { return getToken(PainlessParser.RP, 0); }
+ public DoContext(DstatementContext ctx) { copyFrom(ctx); }
+ @Override
+ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
+ if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitDo(this);
+ else return visitor.visitChildren(this);
+ }
+ }
+ public static class ReturnContext extends DstatementContext {
+ public TerminalNode RETURN() { return getToken(PainlessParser.RETURN, 0); }
+ public ExpressionContext expression() {
+ return getRuleContext(ExpressionContext.class,0);
+ }
+ public ReturnContext(DstatementContext ctx) { copyFrom(ctx); }
+ @Override
+ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
+ if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitReturn(this);
+ else return visitor.visitChildren(this);
+ }
+ }
+
+ public final DstatementContext dstatement() throws RecognitionException {
+ DstatementContext _localctx = new DstatementContext(_ctx, getState());
+ enterRule(_localctx, 10, RULE_dstatement);
+ try {
+ setState(187);
+ _errHandler.sync(this);
+ switch ( getInterpreter().adaptivePredict(_input,14,_ctx) ) {
+ case 1:
+ _localctx = new DoContext(_localctx);
+ enterOuterAlt(_localctx, 1);
+ {
+ setState(172);
+ match(DO);
+ setState(173);
+ block();
+ setState(174);
+ match(WHILE);
+ setState(175);
+ match(LP);
+ setState(176);
+ expression(0);
+ setState(177);
+ match(RP);
+ }
+ break;
+ case 2:
+ _localctx = new DeclContext(_localctx);
+ enterOuterAlt(_localctx, 2);
+ {
+ setState(179);
+ declaration();
+ }
+ break;
+ case 3:
+ _localctx = new ContinueContext(_localctx);
+ enterOuterAlt(_localctx, 3);
+ {
+ setState(180);
+ match(CONTINUE);
+ }
+ break;
+ case 4:
+ _localctx = new BreakContext(_localctx);
+ enterOuterAlt(_localctx, 4);
+ {
+ setState(181);
+ match(BREAK);
+ }
+ break;
+ case 5:
+ _localctx = new ReturnContext(_localctx);
+ enterOuterAlt(_localctx, 5);
+ {
+ setState(182);
+ match(RETURN);
+ setState(183);
+ expression(0);
+ }
+ break;
+ case 6:
_localctx = new ThrowContext(_localctx);
- enterOuterAlt(_localctx, 12);
+ enterOuterAlt(_localctx, 6);
{
- setState(178);
+ setState(184);
match(THROW);
- setState(179);
+ setState(185);
expression(0);
- setState(180);
- delimiter();
}
break;
- case 13:
+ case 7:
_localctx = new ExprContext(_localctx);
- enterOuterAlt(_localctx, 13);
+ enterOuterAlt(_localctx, 7);
{
- setState(182);
+ setState(186);
expression(0);
- setState(183);
- delimiter();
}
break;
}
@@ -953,14 +1047,14 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final TrailerContext trailer() throws RecognitionException {
TrailerContext _localctx = new TrailerContext(_ctx, getState());
- enterRule(_localctx, 8, RULE_trailer);
+ enterRule(_localctx, 12, RULE_trailer);
try {
- setState(189);
+ setState(191);
switch (_input.LA(1)) {
case LBRACK:
enterOuterAlt(_localctx, 1);
{
- setState(187);
+ setState(189);
block();
}
break;
@@ -995,7 +1089,7 @@ public final TrailerContext trailer() throws RecognitionException {
case ID:
enterOuterAlt(_localctx, 2);
{
- setState(188);
+ setState(190);
statement();
}
break;
@@ -1023,6 +1117,9 @@ public List<StatementContext> statement() {
public StatementContext statement(int i) {
return getRuleContext(StatementContext.class,i);
}
+ public DstatementContext dstatement() {
+ return getRuleContext(DstatementContext.class,0);
+ }
public BlockContext(ParserRuleContext parent, int invokingState) {
super(parent, invokingState);
}
@@ -1036,28 +1133,40 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final BlockContext block() throws RecognitionException {
BlockContext _localctx = new BlockContext(_ctx, getState());
- enterRule(_localctx, 10, RULE_block);
+ enterRule(_localctx, 14, RULE_block);
int _la;
try {
+ int _alt;
enterOuterAlt(_localctx, 1);
{
- setState(191);
+ setState(193);
match(LBRACK);
- setState(195);
+ setState(197);
_errHandler.sync(this);
+ _alt = getInterpreter().adaptivePredict(_input,16,_ctx);
+ while ( _alt!=2 && _alt!=org.antlr.v4.runtime.atn.ATN.INVALID_ALT_NUMBER ) {
+ if ( _alt==1 ) {
+ {
+ {
+ setState(194);
+ statement();
+ }
+ }
+ }
+ setState(199);
+ _errHandler.sync(this);
+ _alt = getInterpreter().adaptivePredict(_input,16,_ctx);
+ }
+ setState(201);
_la = _input.LA(1);
- while ((((_la) & ~0x3f) == 0 && ((1L << _la) & ((1L << LBRACE) | (1L << LP) | (1L << IF) | (1L << WHILE) | (1L << DO) | (1L << FOR) | (1L << CONTINUE) | (1L << BREAK) | (1L << RETURN) | (1L << NEW) | (1L << TRY) | (1L << THROW) | (1L << BOOLNOT) | (1L << BWNOT) | (1L << ADD) | (1L << SUB) | (1L << INCR) | (1L << DECR))) != 0) || ((((_la - 72)) & ~0x3f) == 0 && ((1L << (_la - 72)) & ((1L << (OCTAL - 72)) | (1L << (HEX - 72)) | (1L << (INTEGER - 72)) | (1L << (DECIMAL - 72)) | (1L << (STRING - 72)) | (1L << (REGEX - 72)) | (1L << (TRUE - 72)) | (1L << (FALSE - 72)) | (1L << (NULL - 72)) | (1L << (TYPE - 72)) | (1L << (ID - 72)))) != 0)) {
- {
+ if ((((_la) & ~0x3f) == 0 && ((1L << _la) & ((1L << LBRACE) | (1L << LP) | (1L << DO) | (1L << CONTINUE) | (1L << BREAK) | (1L << RETURN) | (1L << NEW) | (1L << THROW) | (1L << BOOLNOT) | (1L << BWNOT) | (1L << ADD) | (1L << SUB) | (1L << INCR) | (1L << DECR))) != 0) || ((((_la - 72)) & ~0x3f) == 0 && ((1L << (_la - 72)) & ((1L << (OCTAL - 72)) | (1L << (HEX - 72)) | (1L << (INTEGER - 72)) | (1L << (DECIMAL - 72)) | (1L << (STRING - 72)) | (1L << (REGEX - 72)) | (1L << (TRUE - 72)) | (1L << (FALSE - 72)) | (1L << (NULL - 72)) | (1L << (TYPE - 72)) | (1L << (ID - 72)))) != 0)) {
{
- setState(192);
- statement();
- }
+ setState(200);
+ dstatement();
}
- setState(197);
- _errHandler.sync(this);
- _la = _input.LA(1);
}
- setState(198);
+
+ setState(203);
match(RBRACK);
}
}
@@ -1087,11 +1196,11 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final EmptyContext empty() throws RecognitionException {
EmptyContext _localctx = new EmptyContext(_ctx, getState());
- enterRule(_localctx, 12, RULE_empty);
+ enterRule(_localctx, 16, RULE_empty);
try {
enterOuterAlt(_localctx, 1);
{
- setState(200);
+ setState(205);
match(SEMICOLON);
}
}
@@ -1126,22 +1235,22 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final InitializerContext initializer() throws RecognitionException {
InitializerContext _localctx = new InitializerContext(_ctx, getState());
- enterRule(_localctx, 14, RULE_initializer);
+ enterRule(_localctx, 18, RULE_initializer);
try {
- setState(204);
+ setState(209);
_errHandler.sync(this);
- switch ( getInterpreter().adaptivePredict(_input,14,_ctx) ) {
+ switch ( getInterpreter().adaptivePredict(_input,18,_ctx) ) {
case 1:
enterOuterAlt(_localctx, 1);
{
- setState(202);
+ setState(207);
declaration();
}
break;
case 2:
enterOuterAlt(_localctx, 2);
{
- setState(203);
+ setState(208);
expression(0);
}
break;
@@ -1175,11 +1284,11 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final AfterthoughtContext afterthought() throws RecognitionException {
AfterthoughtContext _localctx = new AfterthoughtContext(_ctx, getState());
- enterRule(_localctx, 16, RULE_afterthought);
+ enterRule(_localctx, 20, RULE_afterthought);
try {
enterOuterAlt(_localctx, 1);
{
- setState(206);
+ setState(211);
expression(0);
}
}
@@ -1221,28 +1330,28 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final DeclarationContext declaration() throws RecognitionException {
DeclarationContext _localctx = new DeclarationContext(_ctx, getState());
- enterRule(_localctx, 18, RULE_declaration);
+ enterRule(_localctx, 22, RULE_declaration);
int _la;
try {
enterOuterAlt(_localctx, 1);
{
- setState(208);
+ setState(213);
decltype();
- setState(209);
- declvar();
setState(214);
+ declvar();
+ setState(219);
_errHandler.sync(this);
_la = _input.LA(1);
while (_la==COMMA) {
{
{
- setState(210);
+ setState(215);
match(COMMA);
- setState(211);
+ setState(216);
declvar();
}
}
- setState(216);
+ setState(221);
_errHandler.sync(this);
_la = _input.LA(1);
}
@@ -1282,30 +1391,30 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final DecltypeContext decltype() throws RecognitionException {
DecltypeContext _localctx = new DecltypeContext(_ctx, getState());
- enterRule(_localctx, 20, RULE_decltype);
+ enterRule(_localctx, 24, RULE_decltype);
try {
int _alt;
enterOuterAlt(_localctx, 1);
{
- setState(217);
- match(TYPE);
setState(222);
+ match(TYPE);
+ setState(227);
_errHandler.sync(this);
- _alt = getInterpreter().adaptivePredict(_input,16,_ctx);
+ _alt = getInterpreter().adaptivePredict(_input,20,_ctx);
while ( _alt!=2 && _alt!=org.antlr.v4.runtime.atn.ATN.INVALID_ALT_NUMBER ) {
if ( _alt==1 ) {
{
{
- setState(218);
+ setState(223);
match(LBRACE);
- setState(219);
+ setState(224);
match(RBRACE);
}
}
}
- setState(224);
+ setState(229);
_errHandler.sync(this);
- _alt = getInterpreter().adaptivePredict(_input,16,_ctx);
+ _alt = getInterpreter().adaptivePredict(_input,20,_ctx);
}
}
}
@@ -1339,20 +1448,20 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final DeclvarContext declvar() throws RecognitionException {
DeclvarContext _localctx = new DeclvarContext(_ctx, getState());
- enterRule(_localctx, 22, RULE_declvar);
+ enterRule(_localctx, 26, RULE_declvar);
int _la;
try {
enterOuterAlt(_localctx, 1);
{
- setState(225);
+ setState(230);
match(ID);
- setState(228);
+ setState(233);
_la = _input.LA(1);
if (_la==ASSIGN) {
{
- setState(226);
+ setState(231);
match(ASSIGN);
- setState(227);
+ setState(232);
expression(0);
}
}
@@ -1392,21 +1501,21 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final TrapContext trap() throws RecognitionException {
TrapContext _localctx = new TrapContext(_ctx, getState());
- enterRule(_localctx, 24, RULE_trap);
+ enterRule(_localctx, 28, RULE_trap);
try {
enterOuterAlt(_localctx, 1);
{
- setState(230);
+ setState(235);
match(CATCH);
- setState(231);
+ setState(236);
match(LP);
- setState(232);
+ setState(237);
match(TYPE);
- setState(233);
+ setState(238);
match(ID);
- setState(234);
+ setState(239);
match(RP);
- setState(235);
+ setState(240);
block();
}
}
@@ -1421,47 +1530,6 @@ public final TrapContext trap() throws RecognitionException {
return _localctx;
}
- public static class DelimiterContext extends ParserRuleContext {
- public TerminalNode SEMICOLON() { return getToken(PainlessParser.SEMICOLON, 0); }
- public TerminalNode EOF() { return getToken(PainlessParser.EOF, 0); }
- public DelimiterContext(ParserRuleContext parent, int invokingState) {
- super(parent, invokingState);
- }
- @Override public int getRuleIndex() { return RULE_delimiter; }
- @Override
- public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
- if ( visitor instanceof PainlessParserVisitor ) return ((PainlessParserVisitor<? extends T>)visitor).visitDelimiter(this);
- else return visitor.visitChildren(this);
- }
- }
-
- public final DelimiterContext delimiter() throws RecognitionException {
- DelimiterContext _localctx = new DelimiterContext(_ctx, getState());
- enterRule(_localctx, 26, RULE_delimiter);
- int _la;
- try {
- enterOuterAlt(_localctx, 1);
- {
- setState(237);
- _la = _input.LA(1);
- if ( !(_la==EOF || _la==SEMICOLON) ) {
- _errHandler.recoverInline(this);
- } else {
- consume();
- }
- }
- }
- catch (RecognitionException re) {
- _localctx.exception = re;
- _errHandler.reportError(this, re);
- _errHandler.recover(this, re);
- }
- finally {
- exitRule();
- }
- return _localctx;
- }
-
public static class ExpressionContext extends ParserRuleContext {
public ExpressionContext(ParserRuleContext parent, int invokingState) {
super(parent, invokingState);
@@ -1631,8 +1699,8 @@ private ExpressionContext expression(int _p) throws RecognitionException {
int _parentState = getState();
ExpressionContext _localctx = new ExpressionContext(_ctx, _parentState);
ExpressionContext _prevctx = _localctx;
- int _startState = 28;
- enterRecursionRule(_localctx, 28, RULE_expression, _p);
+ int _startState = 30;
+ enterRecursionRule(_localctx, 30, RULE_expression, _p);
int _la;
try {
int _alt;
@@ -1643,35 +1711,35 @@ private ExpressionContext expression(int _p) throws RecognitionException {
_ctx = _localctx;
_prevctx = _localctx;
- setState(240);
+ setState(243);
unary();
}
_ctx.stop = _input.LT(-1);
- setState(292);
+ setState(295);
_errHandler.sync(this);
- _alt = getInterpreter().adaptivePredict(_input,19,_ctx);
+ _alt = getInterpreter().adaptivePredict(_input,23,_ctx);
while ( _alt!=2 && _alt!=org.antlr.v4.runtime.atn.ATN.INVALID_ALT_NUMBER ) {
if ( _alt==1 ) {
if ( _parseListeners!=null ) triggerExitRuleEvent();
_prevctx = _localctx;
{
- setState(290);
+ setState(293);
_errHandler.sync(this);
- switch ( getInterpreter().adaptivePredict(_input,18,_ctx) ) {
+ switch ( getInterpreter().adaptivePredict(_input,22,_ctx) ) {
case 1:
{
_localctx = new BinaryContext(new ExpressionContext(_parentctx, _parentState));
pushNewRecursionContext(_localctx, _startState, RULE_expression);
- setState(242);
+ setState(245);
if (!(precpred(_ctx, 15))) throw new FailedPredicateException(this, "precpred(_ctx, 15)");
- setState(243);
+ setState(246);
_la = _input.LA(1);
if ( !((((_la) & ~0x3f) == 0 && ((1L << _la) & ((1L << MUL) | (1L << DIV) | (1L << REM))) != 0)) ) {
_errHandler.recoverInline(this);
} else {
consume();
}
- setState(244);
+ setState(247);
expression(16);
}
break;
@@ -1679,16 +1747,16 @@ private ExpressionContext expression(int _p) throws RecognitionException {
{
_localctx = new BinaryContext(new ExpressionContext(_parentctx, _parentState));
pushNewRecursionContext(_localctx, _startState, RULE_expression);
- setState(245);
+ setState(248);
if (!(precpred(_ctx, 14))) throw new FailedPredicateException(this, "precpred(_ctx, 14)");
- setState(246);
+ setState(249);
_la = _input.LA(1);
if ( !(_la==ADD || _la==SUB) ) {
_errHandler.recoverInline(this);
} else {
consume();
}
- setState(247);
+ setState(250);
expression(15);
}
break;
@@ -1696,16 +1764,16 @@ private ExpressionContext expression(int _p) throws RecognitionException {
{
_localctx = new BinaryContext(new ExpressionContext(_parentctx, _parentState));
pushNewRecursionContext(_localctx, _startState, RULE_expression);
- setState(248);
+ setState(251);
if (!(precpred(_ctx, 13))) throw new FailedPredicateException(this, "precpred(_ctx, 13)");
- setState(249);
+ setState(252);
_la = _input.LA(1);
if ( !(_la==FIND || _la==MATCH) ) {
_errHandler.recoverInline(this);
} else {
consume();
}
- setState(250);
+ setState(253);
expression(14);
}
break;
@@ -1713,16 +1781,16 @@ private ExpressionContext expression(int _p) throws RecognitionException {
{
_localctx = new BinaryContext(new ExpressionContext(_parentctx, _parentState));
pushNewRecursionContext(_localctx, _startState, RULE_expression);
- setState(251);
+ setState(254);
if (!(precpred(_ctx, 12))) throw new FailedPredicateException(this, "precpred(_ctx, 12)");
- setState(252);
+ setState(255);
_la = _input.LA(1);
if ( !((((_la) & ~0x3f) == 0 && ((1L << _la) & ((1L << LSH) | (1L << RSH) | (1L << USH))) != 0)) ) {
_errHandler.recoverInline(this);
} else {
consume();
}
- setState(253);
+ setState(256);
expression(13);
}
break;
@@ -1730,16 +1798,16 @@ private ExpressionContext expression(int _p) throws RecognitionException {
{
_localctx = new CompContext(new ExpressionContext(_parentctx, _parentState));
pushNewRecursionContext(_localctx, _startState, RULE_expression);
- setState(254);
+ setState(257);
if (!(precpred(_ctx, 11))) throw new FailedPredicateException(this, "precpred(_ctx, 11)");
- setState(255);
+ setState(258);
_la = _input.LA(1);
if ( !((((_la) & ~0x3f) == 0 && ((1L << _la) & ((1L << LT) | (1L << LTE) | (1L << GT) | (1L << GTE))) != 0)) ) {
_errHandler.recoverInline(this);
} else {
consume();
}
- setState(256);
+ setState(259);
expression(12);
}
break;
@@ -1747,16 +1815,16 @@ private ExpressionContext expression(int _p) throws RecognitionException {
{
_localctx = new CompContext(new ExpressionContext(_parentctx, _parentState));
pushNewRecursionContext(_localctx, _startState, RULE_expression);
- setState(257);
+ setState(260);
if (!(precpred(_ctx, 9))) throw new FailedPredicateException(this, "precpred(_ctx, 9)");
- setState(258);
+ setState(261);
_la = _input.LA(1);
if ( !((((_la) & ~0x3f) == 0 && ((1L << _la) & ((1L << EQ) | (1L << EQR) | (1L << NE) | (1L << NER))) != 0)) ) {
_errHandler.recoverInline(this);
} else {
consume();
}
- setState(259);
+ setState(262);
expression(10);
}
break;
@@ -1764,11 +1832,11 @@ private ExpressionContext expression(int _p) throws RecognitionException {
{
_localctx = new BinaryContext(new ExpressionContext(_parentctx, _parentState));
pushNewRecursionContext(_localctx, _startState, RULE_expression);
- setState(260);
+ setState(263);
if (!(precpred(_ctx, 8))) throw new FailedPredicateException(this, "precpred(_ctx, 8)");
- setState(261);
+ setState(264);
match(BWAND);
- setState(262);
+ setState(265);
expression(9);
}
break;
@@ -1776,11 +1844,11 @@ private ExpressionContext expression(int _p) throws RecognitionException {
{
_localctx = new BinaryContext(new ExpressionContext(_parentctx, _parentState));
pushNewRecursionContext(_localctx, _startState, RULE_expression);
- setState(263);
+ setState(266);
if (!(precpred(_ctx, 7))) throw new FailedPredicateException(this, "precpred(_ctx, 7)");
- setState(264);
+ setState(267);
match(XOR);
- setState(265);
+ setState(268);
expression(8);
}
break;
@@ -1788,11 +1856,11 @@ private ExpressionContext expression(int _p) throws RecognitionException {
{
_localctx = new BinaryContext(new ExpressionContext(_parentctx, _parentState));
pushNewRecursionContext(_localctx, _startState, RULE_expression);
- setState(266);
+ setState(269);
if (!(precpred(_ctx, 6))) throw new FailedPredicateException(this, "precpred(_ctx, 6)");
- setState(267);
+ setState(270);
match(BWOR);
- setState(268);
+ setState(271);
expression(7);
}
break;
@@ -1800,11 +1868,11 @@ private ExpressionContext expression(int _p) throws RecognitionException {
{
_localctx = new BoolContext(new ExpressionContext(_parentctx, _parentState));
pushNewRecursionContext(_localctx, _startState, RULE_expression);
- setState(269);
+ setState(272);
if (!(precpred(_ctx, 5))) throw new FailedPredicateException(this, "precpred(_ctx, 5)");
- setState(270);
+ setState(273);
match(BOOLAND);
- setState(271);
+ setState(274);
expression(6);
}
break;
@@ -1812,11 +1880,11 @@ private ExpressionContext expression(int _p) throws RecognitionException {
{
_localctx = new BoolContext(new ExpressionContext(_parentctx, _parentState));
pushNewRecursionContext(_localctx, _startState, RULE_expression);
- setState(272);
+ setState(275);
if (!(precpred(_ctx, 4))) throw new FailedPredicateException(this, "precpred(_ctx, 4)");
- setState(273);
+ setState(276);
match(BOOLOR);
- setState(274);
+ setState(277);
expression(5);
}
break;
@@ -1824,15 +1892,15 @@ private ExpressionContext expression(int _p) throws RecognitionException {
{
_localctx = new ConditionalContext(new ExpressionContext(_parentctx, _parentState));
pushNewRecursionContext(_localctx, _startState, RULE_expression);
- setState(275);
+ setState(278);
if (!(precpred(_ctx, 3))) throw new FailedPredicateException(this, "precpred(_ctx, 3)");
- setState(276);
+ setState(279);
match(COND);
- setState(277);
+ setState(280);
expression(0);
- setState(278);
+ setState(281);
match(COLON);
- setState(279);
+ setState(282);
expression(3);
}
break;
@@ -1840,11 +1908,11 @@ private ExpressionContext expression(int _p) throws RecognitionException {
{
_localctx = new ElvisContext(new ExpressionContext(_parentctx, _parentState));
pushNewRecursionContext(_localctx, _startState, RULE_expression);
- setState(281);
+ setState(284);
if (!(precpred(_ctx, 2))) throw new FailedPredicateException(this, "precpred(_ctx, 2)");
- setState(282);
+ setState(285);
match(ELVIS);
- setState(283);
+ setState(286);
expression(2);
}
break;
@@ -1852,16 +1920,16 @@ private ExpressionContext expression(int _p) throws RecognitionException {
{
_localctx = new AssignmentContext(new ExpressionContext(_parentctx, _parentState));
pushNewRecursionContext(_localctx, _startState, RULE_expression);
- setState(284);
+ setState(287);
if (!(precpred(_ctx, 1))) throw new FailedPredicateException(this, "precpred(_ctx, 1)");
- setState(285);
+ setState(288);
_la = _input.LA(1);
if ( !(((((_la - 60)) & ~0x3f) == 0 && ((1L << (_la - 60)) & ((1L << (ASSIGN - 60)) | (1L << (AADD - 60)) | (1L << (ASUB - 60)) | (1L << (AMUL - 60)) | (1L << (ADIV - 60)) | (1L << (AREM - 60)) | (1L << (AAND - 60)) | (1L << (AXOR - 60)) | (1L << (AOR - 60)) | (1L << (ALSH - 60)) | (1L << (ARSH - 60)) | (1L << (AUSH - 60)))) != 0)) ) {
_errHandler.recoverInline(this);
} else {
consume();
}
- setState(286);
+ setState(289);
expression(1);
}
break;
@@ -1869,20 +1937,20 @@ private ExpressionContext expression(int _p) throws RecognitionException {
{
_localctx = new InstanceofContext(new ExpressionContext(_parentctx, _parentState));
pushNewRecursionContext(_localctx, _startState, RULE_expression);
- setState(287);
+ setState(290);
if (!(precpred(_ctx, 10))) throw new FailedPredicateException(this, "precpred(_ctx, 10)");
- setState(288);
+ setState(291);
match(INSTANCEOF);
- setState(289);
+ setState(292);
decltype();
}
break;
}
}
}
- setState(294);
+ setState(297);
_errHandler.sync(this);
- _alt = getInterpreter().adaptivePredict(_input,19,_ctx);
+ _alt = getInterpreter().adaptivePredict(_input,23,_ctx);
}
}
}
@@ -1979,24 +2047,24 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final UnaryContext unary() throws RecognitionException {
UnaryContext _localctx = new UnaryContext(_ctx, getState());
- enterRule(_localctx, 30, RULE_unary);
+ enterRule(_localctx, 32, RULE_unary);
int _la;
try {
- setState(308);
+ setState(311);
_errHandler.sync(this);
- switch ( getInterpreter().adaptivePredict(_input,20,_ctx) ) {
+ switch ( getInterpreter().adaptivePredict(_input,24,_ctx) ) {
case 1:
_localctx = new PreContext(_localctx);
enterOuterAlt(_localctx, 1);
{
- setState(295);
+ setState(298);
_la = _input.LA(1);
if ( !(_la==INCR || _la==DECR) ) {
_errHandler.recoverInline(this);
} else {
consume();
}
- setState(296);
+ setState(299);
chain();
}
break;
@@ -2004,9 +2072,9 @@ public final UnaryContext unary() throws RecognitionException {
_localctx = new PostContext(_localctx);
enterOuterAlt(_localctx, 2);
{
- setState(297);
+ setState(300);
chain();
- setState(298);
+ setState(301);
_la = _input.LA(1);
if ( !(_la==INCR || _la==DECR) ) {
_errHandler.recoverInline(this);
@@ -2019,7 +2087,7 @@ public final UnaryContext unary() throws RecognitionException {
_localctx = new ReadContext(_localctx);
enterOuterAlt(_localctx, 3);
{
- setState(300);
+ setState(303);
chain();
}
break;
@@ -2027,14 +2095,14 @@ public final UnaryContext unary() throws RecognitionException {
_localctx = new OperatorContext(_localctx);
enterOuterAlt(_localctx, 4);
{
- setState(301);
+ setState(304);
_la = _input.LA(1);
if ( !((((_la) & ~0x3f) == 0 && ((1L << _la) & ((1L << BOOLNOT) | (1L << BWNOT) | (1L << ADD) | (1L << SUB))) != 0)) ) {
_errHandler.recoverInline(this);
} else {
consume();
}
- setState(302);
+ setState(305);
unary();
}
break;
@@ -2042,13 +2110,13 @@ public final UnaryContext unary() throws RecognitionException {
_localctx = new CastContext(_localctx);
enterOuterAlt(_localctx, 5);
{
- setState(303);
+ setState(306);
match(LP);
- setState(304);
+ setState(307);
decltype();
- setState(305);
+ setState(308);
match(RP);
- setState(306);
+ setState(309);
unary();
}
break;
@@ -2127,33 +2195,33 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final ChainContext chain() throws RecognitionException {
ChainContext _localctx = new ChainContext(_ctx, getState());
- enterRule(_localctx, 32, RULE_chain);
+ enterRule(_localctx, 34, RULE_chain);
try {
int _alt;
- setState(326);
+ setState(329);
_errHandler.sync(this);
- switch ( getInterpreter().adaptivePredict(_input,23,_ctx) ) {
+ switch ( getInterpreter().adaptivePredict(_input,27,_ctx) ) {
case 1:
_localctx = new DynamicContext(_localctx);
enterOuterAlt(_localctx, 1);
{
- setState(310);
+ setState(313);
primary();
- setState(314);
+ setState(317);
_errHandler.sync(this);
- _alt = getInterpreter().adaptivePredict(_input,21,_ctx);
+ _alt = getInterpreter().adaptivePredict(_input,25,_ctx);
while ( _alt!=2 && _alt!=org.antlr.v4.runtime.atn.ATN.INVALID_ALT_NUMBER ) {
if ( _alt==1 ) {
{
{
- setState(311);
+ setState(314);
postfix();
}
}
}
- setState(316);
+ setState(319);
_errHandler.sync(this);
- _alt = getInterpreter().adaptivePredict(_input,21,_ctx);
+ _alt = getInterpreter().adaptivePredict(_input,25,_ctx);
}
}
break;
@@ -2161,25 +2229,25 @@ public final ChainContext chain() throws RecognitionException {
_localctx = new StaticContext(_localctx);
enterOuterAlt(_localctx, 2);
{
- setState(317);
+ setState(320);
decltype();
- setState(318);
+ setState(321);
postdot();
- setState(322);
+ setState(325);
_errHandler.sync(this);
- _alt = getInterpreter().adaptivePredict(_input,22,_ctx);
+ _alt = getInterpreter().adaptivePredict(_input,26,_ctx);
while ( _alt!=2 && _alt!=org.antlr.v4.runtime.atn.ATN.INVALID_ALT_NUMBER ) {
if ( _alt==1 ) {
{
{
- setState(319);
+ setState(322);
postfix();
}
}
}
- setState(324);
+ setState(327);
_errHandler.sync(this);
- _alt = getInterpreter().adaptivePredict(_input,22,_ctx);
+ _alt = getInterpreter().adaptivePredict(_input,26,_ctx);
}
}
break;
@@ -2187,7 +2255,7 @@ public final ChainContext chain() throws RecognitionException {
_localctx = new NewarrayContext(_localctx);
enterOuterAlt(_localctx, 3);
{
- setState(325);
+ setState(328);
arrayinitializer();
}
break;
@@ -2344,21 +2412,21 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final PrimaryContext primary() throws RecognitionException {
PrimaryContext _localctx = new PrimaryContext(_ctx, getState());
- enterRule(_localctx, 34, RULE_primary);
+ enterRule(_localctx, 36, RULE_primary);
int _la;
try {
- setState(346);
+ setState(349);
_errHandler.sync(this);
- switch ( getInterpreter().adaptivePredict(_input,24,_ctx) ) {
+ switch ( getInterpreter().adaptivePredict(_input,28,_ctx) ) {
case 1:
_localctx = new PrecedenceContext(_localctx);
enterOuterAlt(_localctx, 1);
{
- setState(328);
+ setState(331);
match(LP);
- setState(329);
+ setState(332);
expression(0);
- setState(330);
+ setState(333);
match(RP);
}
break;
@@ -2366,7 +2434,7 @@ public final PrimaryContext primary() throws RecognitionException {
_localctx = new NumericContext(_localctx);
enterOuterAlt(_localctx, 2);
{
- setState(332);
+ setState(335);
_la = _input.LA(1);
if ( !(((((_la - 72)) & ~0x3f) == 0 && ((1L << (_la - 72)) & ((1L << (OCTAL - 72)) | (1L << (HEX - 72)) | (1L << (INTEGER - 72)) | (1L << (DECIMAL - 72)))) != 0)) ) {
_errHandler.recoverInline(this);
@@ -2379,7 +2447,7 @@ public final PrimaryContext primary() throws RecognitionException {
_localctx = new TrueContext(_localctx);
enterOuterAlt(_localctx, 3);
{
- setState(333);
+ setState(336);
match(TRUE);
}
break;
@@ -2387,7 +2455,7 @@ public final PrimaryContext primary() throws RecognitionException {
_localctx = new FalseContext(_localctx);
enterOuterAlt(_localctx, 4);
{
- setState(334);
+ setState(337);
match(FALSE);
}
break;
@@ -2395,7 +2463,7 @@ public final PrimaryContext primary() throws RecognitionException {
_localctx = new NullContext(_localctx);
enterOuterAlt(_localctx, 5);
{
- setState(335);
+ setState(338);
match(NULL);
}
break;
@@ -2403,7 +2471,7 @@ public final PrimaryContext primary() throws RecognitionException {
_localctx = new StringContext(_localctx);
enterOuterAlt(_localctx, 6);
{
- setState(336);
+ setState(339);
match(STRING);
}
break;
@@ -2411,7 +2479,7 @@ public final PrimaryContext primary() throws RecognitionException {
_localctx = new RegexContext(_localctx);
enterOuterAlt(_localctx, 7);
{
- setState(337);
+ setState(340);
match(REGEX);
}
break;
@@ -2419,7 +2487,7 @@ public final PrimaryContext primary() throws RecognitionException {
_localctx = new ListinitContext(_localctx);
enterOuterAlt(_localctx, 8);
{
- setState(338);
+ setState(341);
listinitializer();
}
break;
@@ -2427,7 +2495,7 @@ public final PrimaryContext primary() throws RecognitionException {
_localctx = new MapinitContext(_localctx);
enterOuterAlt(_localctx, 9);
{
- setState(339);
+ setState(342);
mapinitializer();
}
break;
@@ -2435,7 +2503,7 @@ public final PrimaryContext primary() throws RecognitionException {
_localctx = new VariableContext(_localctx);
enterOuterAlt(_localctx, 10);
{
- setState(340);
+ setState(343);
match(ID);
}
break;
@@ -2443,9 +2511,9 @@ public final PrimaryContext primary() throws RecognitionException {
_localctx = new CalllocalContext(_localctx);
enterOuterAlt(_localctx, 11);
{
- setState(341);
+ setState(344);
match(ID);
- setState(342);
+ setState(345);
arguments();
}
break;
@@ -2453,11 +2521,11 @@ public final PrimaryContext primary() throws RecognitionException {
_localctx = new NewobjectContext(_localctx);
enterOuterAlt(_localctx, 12);
{
- setState(343);
+ setState(346);
match(NEW);
- setState(344);
+ setState(347);
match(TYPE);
- setState(345);
+ setState(348);
arguments();
}
break;
@@ -2497,29 +2565,29 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final PostfixContext postfix() throws RecognitionException {
PostfixContext _localctx = new PostfixContext(_ctx, getState());
- enterRule(_localctx, 36, RULE_postfix);
+ enterRule(_localctx, 38, RULE_postfix);
try {
- setState(351);
+ setState(354);
_errHandler.sync(this);
- switch ( getInterpreter().adaptivePredict(_input,25,_ctx) ) {
+ switch ( getInterpreter().adaptivePredict(_input,29,_ctx) ) {
case 1:
enterOuterAlt(_localctx, 1);
{
- setState(348);
+ setState(351);
callinvoke();
}
break;
case 2:
enterOuterAlt(_localctx, 2);
{
- setState(349);
+ setState(352);
fieldaccess();
}
break;
case 3:
enterOuterAlt(_localctx, 3);
{
- setState(350);
+ setState(353);
braceaccess();
}
break;
@@ -2556,22 +2624,22 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final PostdotContext postdot() throws RecognitionException {
PostdotContext _localctx = new PostdotContext(_ctx, getState());
- enterRule(_localctx, 38, RULE_postdot);
+ enterRule(_localctx, 40, RULE_postdot);
try {
- setState(355);
+ setState(358);
_errHandler.sync(this);
- switch ( getInterpreter().adaptivePredict(_input,26,_ctx) ) {
+ switch ( getInterpreter().adaptivePredict(_input,30,_ctx) ) {
case 1:
enterOuterAlt(_localctx, 1);
{
- setState(353);
+ setState(356);
callinvoke();
}
break;
case 2:
enterOuterAlt(_localctx, 2);
{
- setState(354);
+ setState(357);
fieldaccess();
}
break;
@@ -2608,21 +2676,21 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final CallinvokeContext callinvoke() throws RecognitionException {
CallinvokeContext _localctx = new CallinvokeContext(_ctx, getState());
- enterRule(_localctx, 40, RULE_callinvoke);
+ enterRule(_localctx, 42, RULE_callinvoke);
int _la;
try {
enterOuterAlt(_localctx, 1);
{
- setState(357);
+ setState(360);
_la = _input.LA(1);
if ( !(_la==DOT || _la==NSDOT) ) {
_errHandler.recoverInline(this);
} else {
consume();
}
- setState(358);
+ setState(361);
match(DOTID);
- setState(359);
+ setState(362);
arguments();
}
}
@@ -2655,19 +2723,19 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final FieldaccessContext fieldaccess() throws RecognitionException {
FieldaccessContext _localctx = new FieldaccessContext(_ctx, getState());
- enterRule(_localctx, 42, RULE_fieldaccess);
+ enterRule(_localctx, 44, RULE_fieldaccess);
int _la;
try {
enterOuterAlt(_localctx, 1);
{
- setState(361);
+ setState(364);
_la = _input.LA(1);
if ( !(_la==DOT || _la==NSDOT) ) {
_errHandler.recoverInline(this);
} else {
consume();
}
- setState(362);
+ setState(365);
_la = _input.LA(1);
if ( !(_la==DOTINTEGER || _la==DOTID) ) {
_errHandler.recoverInline(this);
@@ -2706,15 +2774,15 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final BraceaccessContext braceaccess() throws RecognitionException {
BraceaccessContext _localctx = new BraceaccessContext(_ctx, getState());
- enterRule(_localctx, 44, RULE_braceaccess);
+ enterRule(_localctx, 46, RULE_braceaccess);
try {
enterOuterAlt(_localctx, 1);
{
- setState(364);
+ setState(367);
match(LBRACE);
- setState(365);
+ setState(368);
expression(0);
- setState(366);
+ setState(369);
match(RBRACE);
}
}
@@ -2786,7 +2854,6 @@ public List<ExpressionContext> expression() {
public ExpressionContext expression(int i) {
return getRuleContext(ExpressionContext.class,i);
}
- public TerminalNode SEMICOLON() { return getToken(PainlessParser.SEMICOLON, 0); }
public List<PostfixContext> postfix() {
return getRuleContexts(PostfixContext.class);
}
@@ -2807,22 +2874,22 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final ArrayinitializerContext arrayinitializer() throws RecognitionException {
ArrayinitializerContext _localctx = new ArrayinitializerContext(_ctx, getState());
- enterRule(_localctx, 46, RULE_arrayinitializer);
+ enterRule(_localctx, 48, RULE_arrayinitializer);
int _la;
try {
int _alt;
setState(412);
_errHandler.sync(this);
- switch ( getInterpreter().adaptivePredict(_input,34,_ctx) ) {
+ switch ( getInterpreter().adaptivePredict(_input,37,_ctx) ) {
case 1:
_localctx = new NewstandardarrayContext(_localctx);
enterOuterAlt(_localctx, 1);
{
- setState(368);
+ setState(371);
match(NEW);
- setState(369);
+ setState(372);
match(TYPE);
- setState(374);
+ setState(377);
_errHandler.sync(this);
_alt = 1;
do {
@@ -2830,11 +2897,11 @@ public final ArrayinitializerContext arrayinitializer() throws RecognitionExcept
case 1:
{
{
- setState(370);
+ setState(373);
match(LBRACE);
- setState(371);
+ setState(374);
expression(0);
- setState(372);
+ setState(375);
match(RBRACE);
}
}
@@ -2842,32 +2909,32 @@ public final ArrayinitializerContext arrayinitializer() throws RecognitionExcept
default:
throw new NoViableAltException(this);
}
- setState(376);
+ setState(379);
_errHandler.sync(this);
- _alt = getInterpreter().adaptivePredict(_input,27,_ctx);
+ _alt = getInterpreter().adaptivePredict(_input,31,_ctx);
} while ( _alt!=2 && _alt!=org.antlr.v4.runtime.atn.ATN.INVALID_ALT_NUMBER );
- setState(385);
+ setState(388);
_errHandler.sync(this);
- switch ( getInterpreter().adaptivePredict(_input,29,_ctx) ) {
+ switch ( getInterpreter().adaptivePredict(_input,33,_ctx) ) {
case 1:
{
- setState(378);
+ setState(381);
postdot();
- setState(382);
+ setState(385);
_errHandler.sync(this);
- _alt = getInterpreter().adaptivePredict(_input,28,_ctx);
+ _alt = getInterpreter().adaptivePredict(_input,32,_ctx);
while ( _alt!=2 && _alt!=org.antlr.v4.runtime.atn.ATN.INVALID_ALT_NUMBER ) {
if ( _alt==1 ) {
{
{
- setState(379);
+ setState(382);
postfix();
}
}
}
- setState(384);
+ setState(387);
_errHandler.sync(this);
- _alt = getInterpreter().adaptivePredict(_input,28,_ctx);
+ _alt = getInterpreter().adaptivePredict(_input,32,_ctx);
}
}
break;
@@ -2878,55 +2945,46 @@ public final ArrayinitializerContext arrayinitializer() throws RecognitionExcept
_localctx = new NewinitializedarrayContext(_localctx);
enterOuterAlt(_localctx, 2);
{
- setState(387);
+ setState(390);
match(NEW);
- setState(388);
+ setState(391);
match(TYPE);
- setState(389);
+ setState(392);
match(LBRACE);
- setState(390);
+ setState(393);
match(RBRACE);
- setState(391);
+ setState(394);
match(LBRACK);
- setState(400);
+ setState(403);
_la = _input.LA(1);
if ((((_la) & ~0x3f) == 0 && ((1L << _la) & ((1L << LBRACE) | (1L << LP) | (1L << NEW) | (1L << BOOLNOT) | (1L << BWNOT) | (1L << ADD) | (1L << SUB) | (1L << INCR) | (1L << DECR))) != 0) || ((((_la - 72)) & ~0x3f) == 0 && ((1L << (_la - 72)) & ((1L << (OCTAL - 72)) | (1L << (HEX - 72)) | (1L << (INTEGER - 72)) | (1L << (DECIMAL - 72)) | (1L << (STRING - 72)) | (1L << (REGEX - 72)) | (1L << (TRUE - 72)) | (1L << (FALSE - 72)) | (1L << (NULL - 72)) | (1L << (TYPE - 72)) | (1L << (ID - 72)))) != 0)) {
{
- setState(392);
+ setState(395);
expression(0);
- setState(397);
+ setState(400);
_errHandler.sync(this);
_la = _input.LA(1);
while (_la==COMMA) {
{
{
- setState(393);
+ setState(396);
match(COMMA);
- setState(394);
+ setState(397);
expression(0);
}
}
- setState(399);
+ setState(402);
_errHandler.sync(this);
_la = _input.LA(1);
}
}
}
- setState(403);
- _la = _input.LA(1);
- if (_la==SEMICOLON) {
- {
- setState(402);
- match(SEMICOLON);
- }
- }
-
setState(405);
match(RBRACK);
setState(409);
_errHandler.sync(this);
- _alt = getInterpreter().adaptivePredict(_input,33,_ctx);
+ _alt = getInterpreter().adaptivePredict(_input,36,_ctx);
while ( _alt!=2 && _alt!=org.antlr.v4.runtime.atn.ATN.INVALID_ALT_NUMBER ) {
if ( _alt==1 ) {
{
@@ -2938,7 +2996,7 @@ public final ArrayinitializerContext arrayinitializer() throws RecognitionExcept
}
setState(411);
_errHandler.sync(this);
- _alt = getInterpreter().adaptivePredict(_input,33,_ctx);
+ _alt = getInterpreter().adaptivePredict(_input,36,_ctx);
}
}
break;
@@ -2981,12 +3039,12 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final ListinitializerContext listinitializer() throws RecognitionException {
ListinitializerContext _localctx = new ListinitializerContext(_ctx, getState());
- enterRule(_localctx, 48, RULE_listinitializer);
+ enterRule(_localctx, 50, RULE_listinitializer);
int _la;
try {
setState(427);
_errHandler.sync(this);
- switch ( getInterpreter().adaptivePredict(_input,36,_ctx) ) {
+ switch ( getInterpreter().adaptivePredict(_input,39,_ctx) ) {
case 1:
enterOuterAlt(_localctx, 1);
{
@@ -3063,12 +3121,12 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final MapinitializerContext mapinitializer() throws RecognitionException {
MapinitializerContext _localctx = new MapinitializerContext(_ctx, getState());
- enterRule(_localctx, 50, RULE_mapinitializer);
+ enterRule(_localctx, 52, RULE_mapinitializer);
int _la;
try {
setState(443);
_errHandler.sync(this);
- switch ( getInterpreter().adaptivePredict(_input,38,_ctx) ) {
+ switch ( getInterpreter().adaptivePredict(_input,41,_ctx) ) {
case 1:
enterOuterAlt(_localctx, 1);
{
@@ -3141,7 +3199,7 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final MaptokenContext maptoken() throws RecognitionException {
MaptokenContext _localctx = new MaptokenContext(_ctx, getState());
- enterRule(_localctx, 52, RULE_maptoken);
+ enterRule(_localctx, 54, RULE_maptoken);
try {
enterOuterAlt(_localctx, 1);
{
@@ -3190,7 +3248,7 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final ArgumentsContext arguments() throws RecognitionException {
ArgumentsContext _localctx = new ArgumentsContext(_ctx, getState());
- enterRule(_localctx, 54, RULE_arguments);
+ enterRule(_localctx, 56, RULE_arguments);
int _la;
try {
enterOuterAlt(_localctx, 1);
@@ -3262,11 +3320,11 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final ArgumentContext argument() throws RecognitionException {
ArgumentContext _localctx = new ArgumentContext(_ctx, getState());
- enterRule(_localctx, 56, RULE_argument);
+ enterRule(_localctx, 58, RULE_argument);
try {
setState(465);
_errHandler.sync(this);
- switch ( getInterpreter().adaptivePredict(_input,41,_ctx) ) {
+ switch ( getInterpreter().adaptivePredict(_input,44,_ctx) ) {
case 1:
enterOuterAlt(_localctx, 1);
{
@@ -3334,7 +3392,7 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final LambdaContext lambda() throws RecognitionException {
LambdaContext _localctx = new LambdaContext(_ctx, getState());
- enterRule(_localctx, 58, RULE_lambda);
+ enterRule(_localctx, 60, RULE_lambda);
int _la;
try {
enterOuterAlt(_localctx, 1);
@@ -3453,7 +3511,7 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final LamtypeContext lamtype() throws RecognitionException {
LamtypeContext _localctx = new LamtypeContext(_ctx, getState());
- enterRule(_localctx, 60, RULE_lamtype);
+ enterRule(_localctx, 62, RULE_lamtype);
int _la;
try {
enterOuterAlt(_localctx, 1);
@@ -3544,11 +3602,11 @@ public <T> T accept(ParseTreeVisitor<? extends T> visitor) {
public final FuncrefContext funcref() throws RecognitionException {
FuncrefContext _localctx = new FuncrefContext(_ctx, getState());
- enterRule(_localctx, 62, RULE_funcref);
+ enterRule(_localctx, 64, RULE_funcref);
try {
setState(505);
_errHandler.sync(this);
- switch ( getInterpreter().adaptivePredict(_input,47,_ctx) ) {
+ switch ( getInterpreter().adaptivePredict(_input,50,_ctx) ) {
case 1:
_localctx = new ClassfuncrefContext(_localctx);
enterOuterAlt(_localctx, 1);
@@ -3612,14 +3670,14 @@ public final FuncrefContext funcref() throws RecognitionException {
public boolean sempred(RuleContext _localctx, int ruleIndex, int predIndex) {
switch (ruleIndex) {
- case 3:
- return statement_sempred((StatementContext)_localctx, predIndex);
- case 14:
+ case 4:
+ return rstatement_sempred((RstatementContext)_localctx, predIndex);
+ case 15:
return expression_sempred((ExpressionContext)_localctx, predIndex);
}
return true;
}
- private boolean statement_sempred(StatementContext _localctx, int predIndex) {
+ private boolean rstatement_sempred(RstatementContext _localctx, int predIndex) {
switch (predIndex) {
case 0:
return _input.LA(1) != ELSE ;
@@ -3668,196 +3726,195 @@ private boolean expression_sempred(ExpressionContext _localctx, int predIndex) {
"\13\4\f\t\f\4\r\t\r\4\16\t\16\4\17\t\17\4\20\t\20\4\21\t\21\4\22\t\22"+
"\4\23\t\23\4\24\t\24\4\25\t\25\4\26\t\26\4\27\t\27\4\30\t\30\4\31\t\31"+
"\4\32\t\32\4\33\t\33\4\34\t\34\4\35\t\35\4\36\t\36\4\37\t\37\4 \t \4!"+
- "\t!\3\2\7\2D\n\2\f\2\16\2G\13\2\3\2\7\2J\n\2\f\2\16\2M\13\2\3\2\3\2\3"+
- "\3\3\3\3\3\3\3\3\3\3\4\3\4\3\4\3\4\3\4\3\4\3\4\7\4]\n\4\f\4\16\4`\13\4"+
- "\5\4b\n\4\3\4\3\4\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\5\5n\n\5\3\5\3\5\3\5"+
- "\3\5\3\5\3\5\5\5v\n\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\5\5"+
- "\u0083\n\5\3\5\3\5\5\5\u0087\n\5\3\5\3\5\5\5\u008b\n\5\3\5\3\5\3\5\5\5"+
- "\u0090\n\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5"+
- "\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\3\5\6\5\u00b1"+
- "\n\5\r\5\16\5\u00b2\3\5\3\5\3\5\3\5\3\5\3\5\3\5\5\5\u00bc\n\5\3\6\3\6"+
- "\5\6\u00c0\n\6\3\7\3\7\7\7\u00c4\n\7\f\7\16\7\u00c7\13\7\3\7\3\7\3\b\3"+
- "\b\3\t\3\t\5\t\u00cf\n\t\3\n\3\n\3\13\3\13\3\13\3\13\7\13\u00d7\n\13\f"+
- "\13\16\13\u00da\13\13\3\f\3\f\3\f\7\f\u00df\n\f\f\f\16\f\u00e2\13\f\3"+
- "\r\3\r\3\r\5\r\u00e7\n\r\3\16\3\16\3\16\3\16\3\16\3\16\3\16\3\17\3\17"+
- "\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20"+
- "\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20"+
- "\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20"+
- "\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\20\7\20\u0125\n\20\f\20\16"+
- "\20\u0128\13\20\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21"+
- "\3\21\3\21\5\21\u0137\n\21\3\22\3\22\7\22\u013b\n\22\f\22\16\22\u013e"+
- "\13\22\3\22\3\22\3\22\7\22\u0143\n\22\f\22\16\22\u0146\13\22\3\22\5\22"+
- "\u0149\n\22\3\23\3\23\3\23\3\23\3\23\3\23\3\23\3\23\3\23\3\23\3\23\3\23"+
- "\3\23\3\23\3\23\3\23\3\23\3\23\5\23\u015d\n\23\3\24\3\24\3\24\5\24\u0162"+
- "\n\24\3\25\3\25\5\25\u0166\n\25\3\26\3\26\3\26\3\26\3\27\3\27\3\27\3\30"+
- "\3\30\3\30\3\30\3\31\3\31\3\31\3\31\3\31\3\31\6\31\u0179\n\31\r\31\16"+
- "\31\u017a\3\31\3\31\7\31\u017f\n\31\f\31\16\31\u0182\13\31\5\31\u0184"+
- "\n\31\3\31\3\31\3\31\3\31\3\31\3\31\3\31\3\31\7\31\u018e\n\31\f\31\16"+
- "\31\u0191\13\31\5\31\u0193\n\31\3\31\5\31\u0196\n\31\3\31\3\31\7\31\u019a"+
- "\n\31\f\31\16\31\u019d\13\31\5\31\u019f\n\31\3\32\3\32\3\32\3\32\7\32"+
- "\u01a5\n\32\f\32\16\32\u01a8\13\32\3\32\3\32\3\32\3\32\5\32\u01ae\n\32"+
- "\3\33\3\33\3\33\3\33\7\33\u01b4\n\33\f\33\16\33\u01b7\13\33\3\33\3\33"+
- "\3\33\3\33\3\33\5\33\u01be\n\33\3\34\3\34\3\34\3\34\3\35\3\35\3\35\3\35"+
- "\7\35\u01c8\n\35\f\35\16\35\u01cb\13\35\5\35\u01cd\n\35\3\35\3\35\3\36"+
- "\3\36\3\36\5\36\u01d4\n\36\3\37\3\37\3\37\3\37\3\37\7\37\u01db\n\37\f"+
- "\37\16\37\u01de\13\37\5\37\u01e0\n\37\3\37\5\37\u01e3\n\37\3\37\3\37\3"+
- "\37\5\37\u01e8\n\37\3 \5 \u01eb\n \3 \3 \3!\3!\3!\3!\3!\3!\3!\3!\3!\3"+
- "!\3!\3!\3!\5!\u01fc\n!\3!\2\3\36\"\2\4\6\b\n\f\16\20\22\24\26\30\32\34"+
- "\36 \"$&(*,.\60\62\64\668:<>@\2\17\3\3\16\16\3\2 \"\3\2#$\3\2:;\3\2%\'"+
- "\3\2(+\3\2,/\3\2>I\3\2<=\4\2\36\37#$\3\2JM\3\2\13\f\3\2UV\u0237\2E\3\2"+
- "\2\2\4P\3\2\2\2\6U\3\2\2\2\b\u00bb\3\2\2\2\n\u00bf\3\2\2\2\f\u00c1\3\2"+
- "\2\2\16\u00ca\3\2\2\2\20\u00ce\3\2\2\2\22\u00d0\3\2\2\2\24\u00d2\3\2\2"+
- "\2\26\u00db\3\2\2\2\30\u00e3\3\2\2\2\32\u00e8\3\2\2\2\34\u00ef\3\2\2\2"+
- "\36\u00f1\3\2\2\2 \u0136\3\2\2\2\"\u0148\3\2\2\2$\u015c\3\2\2\2&\u0161"+
- "\3\2\2\2(\u0165\3\2\2\2*\u0167\3\2\2\2,\u016b\3\2\2\2.\u016e\3\2\2\2\60"+
- "\u019e\3\2\2\2\62\u01ad\3\2\2\2\64\u01bd\3\2\2\2\66\u01bf\3\2\2\28\u01c3"+
- "\3\2\2\2:\u01d3\3\2\2\2<\u01e2\3\2\2\2>\u01ea\3\2\2\2@\u01fb\3\2\2\2B"+
- "D\5\4\3\2CB\3\2\2\2DG\3\2\2\2EC\3\2\2\2EF\3\2\2\2FK\3\2\2\2GE\3\2\2\2"+
- "HJ\5\b\5\2IH\3\2\2\2JM\3\2\2\2KI\3\2\2\2KL\3\2\2\2LN\3\2\2\2MK\3\2\2\2"+
- "NO\7\2\2\3O\3\3\2\2\2PQ\5\26\f\2QR\7T\2\2RS\5\6\4\2ST\5\f\7\2T\5\3\2\2"+
- "\2Ua\7\t\2\2VW\5\26\f\2W^\7T\2\2XY\7\r\2\2YZ\5\26\f\2Z[\7T\2\2[]\3\2\2"+
- "\2\\X\3\2\2\2]`\3\2\2\2^\\\3\2\2\2^_\3\2\2\2_b\3\2\2\2`^\3\2\2\2aV\3\2"+
- "\2\2ab\3\2\2\2bc\3\2\2\2cd\7\n\2\2d\7\3\2\2\2ef\7\17\2\2fg\7\t\2\2gh\5"+
- "\36\20\2hi\7\n\2\2im\5\n\6\2jk\7\21\2\2kn\5\n\6\2ln\6\5\2\2mj\3\2\2\2"+
- "ml\3\2\2\2n\u00bc\3\2\2\2op\7\22\2\2pq\7\t\2\2qr\5\36\20\2ru\7\n\2\2s"+
- "v\5\n\6\2tv\5\16\b\2us\3\2\2\2ut\3\2\2\2v\u00bc\3\2\2\2wx\7\23\2\2xy\5"+
- "\f\7\2yz\7\22\2\2z{\7\t\2\2{|\5\36\20\2|}\7\n\2\2}~\5\34\17\2~\u00bc\3"+
- "\2\2\2\177\u0080\7\24\2\2\u0080\u0082\7\t\2\2\u0081\u0083\5\20\t\2\u0082"+
- "\u0081\3\2\2\2\u0082\u0083\3\2\2\2\u0083\u0084\3\2\2\2\u0084\u0086\7\16"+
- "\2\2\u0085\u0087\5\36\20\2\u0086\u0085\3\2\2\2\u0086\u0087\3\2\2\2\u0087"+
- "\u0088\3\2\2\2\u0088\u008a\7\16\2\2\u0089\u008b\5\22\n\2\u008a\u0089\3"+
- "\2\2\2\u008a\u008b\3\2\2\2\u008b\u008c\3\2\2\2\u008c\u008f\7\n\2\2\u008d"+
- "\u0090\5\n\6\2\u008e\u0090\5\16\b\2\u008f\u008d\3\2\2\2\u008f\u008e\3"+
- "\2\2\2\u0090\u00bc\3\2\2\2\u0091\u0092\7\24\2\2\u0092\u0093\7\t\2\2\u0093"+
- "\u0094\5\26\f\2\u0094\u0095\7T\2\2\u0095\u0096\7\66\2\2\u0096\u0097\5"+
- "\36\20\2\u0097\u0098\7\n\2\2\u0098\u0099\5\n\6\2\u0099\u00bc\3\2\2\2\u009a"+
- "\u009b\7\24\2\2\u009b\u009c\7\t\2\2\u009c\u009d\7T\2\2\u009d\u009e\7\20"+
- "\2\2\u009e\u009f\5\36\20\2\u009f\u00a0\7\n\2\2\u00a0\u00a1\5\n\6\2\u00a1"+
- "\u00bc\3\2\2\2\u00a2\u00a3\5\24\13\2\u00a3\u00a4\5\34\17\2\u00a4\u00bc"+
- "\3\2\2\2\u00a5\u00a6\7\25\2\2\u00a6\u00bc\5\34\17\2\u00a7\u00a8\7\26\2"+
- "\2\u00a8\u00bc\5\34\17\2\u00a9\u00aa\7\27\2\2\u00aa\u00ab\5\36\20\2\u00ab"+
- "\u00ac\5\34\17\2\u00ac\u00bc\3\2\2\2\u00ad\u00ae\7\31\2\2\u00ae\u00b0"+
- "\5\f\7\2\u00af\u00b1\5\32\16\2\u00b0\u00af\3\2\2\2\u00b1\u00b2\3\2\2\2"+
- "\u00b2\u00b0\3\2\2\2\u00b2\u00b3\3\2\2\2\u00b3\u00bc\3\2\2\2\u00b4\u00b5"+
- "\7\33\2\2\u00b5\u00b6\5\36\20\2\u00b6\u00b7\5\34\17\2\u00b7\u00bc\3\2"+
- "\2\2\u00b8\u00b9\5\36\20\2\u00b9\u00ba\5\34\17\2\u00ba\u00bc\3\2\2\2\u00bb"+
- "e\3\2\2\2\u00bbo\3\2\2\2\u00bbw\3\2\2\2\u00bb\177\3\2\2\2\u00bb\u0091"+
- "\3\2\2\2\u00bb\u009a\3\2\2\2\u00bb\u00a2\3\2\2\2\u00bb\u00a5\3\2\2\2\u00bb"+
- "\u00a7\3\2\2\2\u00bb\u00a9\3\2\2\2\u00bb\u00ad\3\2\2\2\u00bb\u00b4\3\2"+
- "\2\2\u00bb\u00b8\3\2\2\2\u00bc\t\3\2\2\2\u00bd\u00c0\5\f\7\2\u00be\u00c0"+
- "\5\b\5\2\u00bf\u00bd\3\2\2\2\u00bf\u00be\3\2\2\2\u00c0\13\3\2\2\2\u00c1"+
- "\u00c5\7\5\2\2\u00c2\u00c4\5\b\5\2\u00c3\u00c2\3\2\2\2\u00c4\u00c7\3\2"+
- "\2\2\u00c5\u00c3\3\2\2\2\u00c5\u00c6\3\2\2\2\u00c6\u00c8\3\2\2\2\u00c7"+
- "\u00c5\3\2\2\2\u00c8\u00c9\7\6\2\2\u00c9\r\3\2\2\2\u00ca\u00cb\7\16\2"+
- "\2\u00cb\17\3\2\2\2\u00cc\u00cf\5\24\13\2\u00cd\u00cf\5\36\20\2\u00ce"+
- "\u00cc\3\2\2\2\u00ce\u00cd\3\2\2\2\u00cf\21\3\2\2\2\u00d0\u00d1\5\36\20"+
- "\2\u00d1\23\3\2\2\2\u00d2\u00d3\5\26\f\2\u00d3\u00d8\5\30\r\2\u00d4\u00d5"+
- "\7\r\2\2\u00d5\u00d7\5\30\r\2\u00d6\u00d4\3\2\2\2\u00d7\u00da\3\2\2\2"+
- "\u00d8\u00d6\3\2\2\2\u00d8\u00d9\3\2\2\2\u00d9\25\3\2\2\2\u00da\u00d8"+
- "\3\2\2\2\u00db\u00e0\7S\2\2\u00dc\u00dd\7\7\2\2\u00dd\u00df\7\b\2\2\u00de"+
- "\u00dc\3\2\2\2\u00df\u00e2\3\2\2\2\u00e0\u00de\3\2\2\2\u00e0\u00e1\3\2"+
- "\2\2\u00e1\27\3\2\2\2\u00e2\u00e0\3\2\2\2\u00e3\u00e6\7T\2\2\u00e4\u00e5"+
- "\7>\2\2\u00e5\u00e7\5\36\20\2\u00e6\u00e4\3\2\2\2\u00e6\u00e7\3\2\2\2"+
- "\u00e7\31\3\2\2\2\u00e8\u00e9\7\32\2\2\u00e9\u00ea\7\t\2\2\u00ea\u00eb"+
- "\7S\2\2\u00eb\u00ec\7T\2\2\u00ec\u00ed\7\n\2\2\u00ed\u00ee\5\f\7\2\u00ee"+
- "\33\3\2\2\2\u00ef\u00f0\t\2\2\2\u00f0\35\3\2\2\2\u00f1\u00f2\b\20\1\2"+
- "\u00f2\u00f3\5 \21\2\u00f3\u0126\3\2\2\2\u00f4\u00f5\f\21\2\2\u00f5\u00f6"+
- "\t\3\2\2\u00f6\u0125\5\36\20\22\u00f7\u00f8\f\20\2\2\u00f8\u00f9\t\4\2"+
- "\2\u00f9\u0125\5\36\20\21\u00fa\u00fb\f\17\2\2\u00fb\u00fc\t\5\2\2\u00fc"+
- "\u0125\5\36\20\20\u00fd\u00fe\f\16\2\2\u00fe\u00ff\t\6\2\2\u00ff\u0125"+
- "\5\36\20\17\u0100\u0101\f\r\2\2\u0101\u0102\t\7\2\2\u0102\u0125\5\36\20"+
- "\16\u0103\u0104\f\13\2\2\u0104\u0105\t\b\2\2\u0105\u0125\5\36\20\f\u0106"+
- "\u0107\f\n\2\2\u0107\u0108\7\60\2\2\u0108\u0125\5\36\20\13\u0109\u010a"+
- "\f\t\2\2\u010a\u010b\7\61\2\2\u010b\u0125\5\36\20\n\u010c\u010d\f\b\2"+
- "\2\u010d\u010e\7\62\2\2\u010e\u0125\5\36\20\t\u010f\u0110\f\7\2\2\u0110"+
- "\u0111\7\63\2\2\u0111\u0125\5\36\20\b\u0112\u0113\f\6\2\2\u0113\u0114"+
- "\7\64\2\2\u0114\u0125\5\36\20\7\u0115\u0116\f\5\2\2\u0116\u0117\7\65\2"+
- "\2\u0117\u0118\5\36\20\2\u0118\u0119\7\66\2\2\u0119\u011a\5\36\20\5\u011a"+
- "\u0125\3\2\2\2\u011b\u011c\f\4\2\2\u011c\u011d\7\67\2\2\u011d\u0125\5"+
- "\36\20\4\u011e\u011f\f\3\2\2\u011f\u0120\t\t\2\2\u0120\u0125\5\36\20\3"+
- "\u0121\u0122\f\f\2\2\u0122\u0123\7\35\2\2\u0123\u0125\5\26\f\2\u0124\u00f4"+
- "\3\2\2\2\u0124\u00f7\3\2\2\2\u0124\u00fa\3\2\2\2\u0124\u00fd\3\2\2\2\u0124"+
- "\u0100\3\2\2\2\u0124\u0103\3\2\2\2\u0124\u0106\3\2\2\2\u0124\u0109\3\2"+
- "\2\2\u0124\u010c\3\2\2\2\u0124\u010f\3\2\2\2\u0124\u0112\3\2\2\2\u0124"+
- "\u0115\3\2\2\2\u0124\u011b\3\2\2\2\u0124\u011e\3\2\2\2\u0124\u0121\3\2"+
- "\2\2\u0125\u0128\3\2\2\2\u0126\u0124\3\2\2\2\u0126\u0127\3\2\2\2\u0127"+
- "\37\3\2\2\2\u0128\u0126\3\2\2\2\u0129\u012a\t\n\2\2\u012a\u0137\5\"\22"+
- "\2\u012b\u012c\5\"\22\2\u012c\u012d\t\n\2\2\u012d\u0137\3\2\2\2\u012e"+
- "\u0137\5\"\22\2\u012f\u0130\t\13\2\2\u0130\u0137\5 \21\2\u0131\u0132\7"+
- "\t\2\2\u0132\u0133\5\26\f\2\u0133\u0134\7\n\2\2\u0134\u0135\5 \21\2\u0135"+
- "\u0137\3\2\2\2\u0136\u0129\3\2\2\2\u0136\u012b\3\2\2\2\u0136\u012e\3\2"+
- "\2\2\u0136\u012f\3\2\2\2\u0136\u0131\3\2\2\2\u0137!\3\2\2\2\u0138\u013c"+
- "\5$\23\2\u0139\u013b\5&\24\2\u013a\u0139\3\2\2\2\u013b\u013e\3\2\2\2\u013c"+
- "\u013a\3\2\2\2\u013c\u013d\3\2\2\2\u013d\u0149\3\2\2\2\u013e\u013c\3\2"+
- "\2\2\u013f\u0140\5\26\f\2\u0140\u0144\5(\25\2\u0141\u0143\5&\24\2\u0142"+
- "\u0141\3\2\2\2\u0143\u0146\3\2\2\2\u0144\u0142\3\2\2\2\u0144\u0145\3\2"+
- "\2\2\u0145\u0149\3\2\2\2\u0146\u0144\3\2\2\2\u0147\u0149\5\60\31\2\u0148"+
- "\u0138\3\2\2\2\u0148\u013f\3\2\2\2\u0148\u0147\3\2\2\2\u0149#\3\2\2\2"+
- "\u014a\u014b\7\t\2\2\u014b\u014c\5\36\20\2\u014c\u014d\7\n\2\2\u014d\u015d"+
- "\3\2\2\2\u014e\u015d\t\f\2\2\u014f\u015d\7P\2\2\u0150\u015d\7Q\2\2\u0151"+
- "\u015d\7R\2\2\u0152\u015d\7N\2\2\u0153\u015d\7O\2\2\u0154\u015d\5\62\32"+
- "\2\u0155\u015d\5\64\33\2\u0156\u015d\7T\2\2\u0157\u0158\7T\2\2\u0158\u015d"+
- "\58\35\2\u0159\u015a\7\30\2\2\u015a\u015b\7S\2\2\u015b\u015d\58\35\2\u015c"+
- "\u014a\3\2\2\2\u015c\u014e\3\2\2\2\u015c\u014f\3\2\2\2\u015c\u0150\3\2"+
- "\2\2\u015c\u0151\3\2\2\2\u015c\u0152\3\2\2\2\u015c\u0153\3\2\2\2\u015c"+
- "\u0154\3\2\2\2\u015c\u0155\3\2\2\2\u015c\u0156\3\2\2\2\u015c\u0157\3\2"+
- "\2\2\u015c\u0159\3\2\2\2\u015d%\3\2\2\2\u015e\u0162\5*\26\2\u015f\u0162"+
- "\5,\27\2\u0160\u0162\5.\30\2\u0161\u015e\3\2\2\2\u0161\u015f\3\2\2\2\u0161"+
- "\u0160\3\2\2\2\u0162\'\3\2\2\2\u0163\u0166\5*\26\2\u0164\u0166\5,\27\2"+
- "\u0165\u0163\3\2\2\2\u0165\u0164\3\2\2\2\u0166)\3\2\2\2\u0167\u0168\t"+
- "\r\2\2\u0168\u0169\7V\2\2\u0169\u016a\58\35\2\u016a+\3\2\2\2\u016b\u016c"+
- "\t\r\2\2\u016c\u016d\t\16\2\2\u016d-\3\2\2\2\u016e\u016f\7\7\2\2\u016f"+
- "\u0170\5\36\20\2\u0170\u0171\7\b\2\2\u0171/\3\2\2\2\u0172\u0173\7\30\2"+
- "\2\u0173\u0178\7S\2\2\u0174\u0175\7\7\2\2\u0175\u0176\5\36\20\2\u0176"+
- "\u0177\7\b\2\2\u0177\u0179\3\2\2\2\u0178\u0174\3\2\2\2\u0179\u017a\3\2"+
- "\2\2\u017a\u0178\3\2\2\2\u017a\u017b\3\2\2\2\u017b\u0183\3\2\2\2\u017c"+
- "\u0180\5(\25\2\u017d\u017f\5&\24\2\u017e\u017d\3\2\2\2\u017f\u0182\3\2"+
- "\2\2\u0180\u017e\3\2\2\2\u0180\u0181\3\2\2\2\u0181\u0184\3\2\2\2\u0182"+
- "\u0180\3\2\2\2\u0183\u017c\3\2\2\2\u0183\u0184\3\2\2\2\u0184\u019f\3\2"+
- "\2\2\u0185\u0186\7\30\2\2\u0186\u0187\7S\2\2\u0187\u0188\7\7\2\2\u0188"+
- "\u0189\7\b\2\2\u0189\u0192\7\5\2\2\u018a\u018f\5\36\20\2\u018b\u018c\7"+
- "\r\2\2\u018c\u018e\5\36\20\2\u018d\u018b\3\2\2\2\u018e\u0191\3\2\2\2\u018f"+
- "\u018d\3\2\2\2\u018f\u0190\3\2\2\2\u0190\u0193\3\2\2\2\u0191\u018f\3\2"+
- "\2\2\u0192\u018a\3\2\2\2\u0192\u0193\3\2\2\2\u0193\u0195\3\2\2\2\u0194"+
- "\u0196\7\16\2\2\u0195\u0194\3\2\2\2\u0195\u0196\3\2\2\2\u0196\u0197\3"+
- "\2\2\2\u0197\u019b\7\6\2\2\u0198\u019a\5&\24\2\u0199\u0198\3\2\2\2\u019a"+
+ "\t!\4\"\t\"\3\2\7\2F\n\2\f\2\16\2I\13\2\3\2\7\2L\n\2\f\2\16\2O\13\2\3"+
+ "\2\5\2R\n\2\3\2\3\2\3\3\3\3\3\3\3\3\3\3\3\4\3\4\3\4\3\4\3\4\3\4\3\4\7"+
+ "\4b\n\4\f\4\16\4e\13\4\5\4g\n\4\3\4\3\4\3\5\3\5\3\5\3\5\5\5o\n\5\3\6\3"+
+ "\6\3\6\3\6\3\6\3\6\3\6\3\6\5\6y\n\6\3\6\3\6\3\6\3\6\3\6\3\6\5\6\u0081"+
+ "\n\6\3\6\3\6\3\6\5\6\u0086\n\6\3\6\3\6\5\6\u008a\n\6\3\6\3\6\5\6\u008e"+
+ "\n\6\3\6\3\6\3\6\5\6\u0093\n\6\3\6\3\6\3\6\3\6\3\6\3\6\3\6\3\6\3\6\3\6"+
+ "\3\6\3\6\3\6\3\6\3\6\3\6\3\6\3\6\3\6\3\6\6\6\u00a9\n\6\r\6\16\6\u00aa"+
+ "\5\6\u00ad\n\6\3\7\3\7\3\7\3\7\3\7\3\7\3\7\3\7\3\7\3\7\3\7\3\7\3\7\3\7"+
+ "\3\7\5\7\u00be\n\7\3\b\3\b\5\b\u00c2\n\b\3\t\3\t\7\t\u00c6\n\t\f\t\16"+
+ "\t\u00c9\13\t\3\t\5\t\u00cc\n\t\3\t\3\t\3\n\3\n\3\13\3\13\5\13\u00d4\n"+
+ "\13\3\f\3\f\3\r\3\r\3\r\3\r\7\r\u00dc\n\r\f\r\16\r\u00df\13\r\3\16\3\16"+
+ "\3\16\7\16\u00e4\n\16\f\16\16\16\u00e7\13\16\3\17\3\17\3\17\5\17\u00ec"+
+ "\n\17\3\20\3\20\3\20\3\20\3\20\3\20\3\20\3\21\3\21\3\21\3\21\3\21\3\21"+
+ "\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21"+
+ "\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21"+
+ "\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\21"+
+ "\3\21\3\21\3\21\7\21\u0128\n\21\f\21\16\21\u012b\13\21\3\22\3\22\3\22"+
+ "\3\22\3\22\3\22\3\22\3\22\3\22\3\22\3\22\3\22\3\22\5\22\u013a\n\22\3\23"+
+ "\3\23\7\23\u013e\n\23\f\23\16\23\u0141\13\23\3\23\3\23\3\23\7\23\u0146"+
+ "\n\23\f\23\16\23\u0149\13\23\3\23\5\23\u014c\n\23\3\24\3\24\3\24\3\24"+
+ "\3\24\3\24\3\24\3\24\3\24\3\24\3\24\3\24\3\24\3\24\3\24\3\24\3\24\3\24"+
+ "\5\24\u0160\n\24\3\25\3\25\3\25\5\25\u0165\n\25\3\26\3\26\5\26\u0169\n"+
+ "\26\3\27\3\27\3\27\3\27\3\30\3\30\3\30\3\31\3\31\3\31\3\31\3\32\3\32\3"+
+ "\32\3\32\3\32\3\32\6\32\u017c\n\32\r\32\16\32\u017d\3\32\3\32\7\32\u0182"+
+ "\n\32\f\32\16\32\u0185\13\32\5\32\u0187\n\32\3\32\3\32\3\32\3\32\3\32"+
+ "\3\32\3\32\3\32\7\32\u0191\n\32\f\32\16\32\u0194\13\32\5\32\u0196\n\32"+
+ "\3\32\3\32\7\32\u019a\n\32\f\32\16\32\u019d\13\32\5\32\u019f\n\32\3\33"+
+ "\3\33\3\33\3\33\7\33\u01a5\n\33\f\33\16\33\u01a8\13\33\3\33\3\33\3\33"+
+ "\3\33\5\33\u01ae\n\33\3\34\3\34\3\34\3\34\7\34\u01b4\n\34\f\34\16\34\u01b7"+
+ "\13\34\3\34\3\34\3\34\3\34\3\34\5\34\u01be\n\34\3\35\3\35\3\35\3\35\3"+
+ "\36\3\36\3\36\3\36\7\36\u01c8\n\36\f\36\16\36\u01cb\13\36\5\36\u01cd\n"+
+ "\36\3\36\3\36\3\37\3\37\3\37\5\37\u01d4\n\37\3 \3 \3 \3 \3 \7 \u01db\n"+
+ " \f \16 \u01de\13 \5 \u01e0\n \3 \5 \u01e3\n \3 \3 \3 \5 \u01e8\n \3!"+
+ "\5!\u01eb\n!\3!\3!\3\"\3\"\3\"\3\"\3\"\3\"\3\"\3\"\3\"\3\"\3\"\3\"\3\""+
+ "\5\"\u01fc\n\"\3\"\2\3 #\2\4\6\b\n\f\16\20\22\24\26\30\32\34\36 \"$&("+
+ "*,.\60\62\64\668:<>@B\2\16\3\2 \"\3\2#$\3\2:;\3\2%\'\3\2(+\3\2,/\3\2>"+
+ "I\3\2<=\4\2\36\37#$\3\2JM\3\2\13\f\3\2UV\u0237\2G\3\2\2\2\4U\3\2\2\2\6"+
+ "Z\3\2\2\2\bn\3\2\2\2\n\u00ac\3\2\2\2\f\u00bd\3\2\2\2\16\u00c1\3\2\2\2"+
+ "\20\u00c3\3\2\2\2\22\u00cf\3\2\2\2\24\u00d3\3\2\2\2\26\u00d5\3\2\2\2\30"+
+ "\u00d7\3\2\2\2\32\u00e0\3\2\2\2\34\u00e8\3\2\2\2\36\u00ed\3\2\2\2 \u00f4"+
+ "\3\2\2\2\"\u0139\3\2\2\2$\u014b\3\2\2\2&\u015f\3\2\2\2(\u0164\3\2\2\2"+
+ "*\u0168\3\2\2\2,\u016a\3\2\2\2.\u016e\3\2\2\2\60\u0171\3\2\2\2\62\u019e"+
+ "\3\2\2\2\64\u01ad\3\2\2\2\66\u01bd\3\2\2\28\u01bf\3\2\2\2:\u01c3\3\2\2"+
+ "\2<\u01d3\3\2\2\2>\u01e2\3\2\2\2@\u01ea\3\2\2\2B\u01fb\3\2\2\2DF\5\4\3"+
+ "\2ED\3\2\2\2FI\3\2\2\2GE\3\2\2\2GH\3\2\2\2HM\3\2\2\2IG\3\2\2\2JL\5\b\5"+
+ "\2KJ\3\2\2\2LO\3\2\2\2MK\3\2\2\2MN\3\2\2\2NQ\3\2\2\2OM\3\2\2\2PR\5\f\7"+
+ "\2QP\3\2\2\2QR\3\2\2\2RS\3\2\2\2ST\7\2\2\3T\3\3\2\2\2UV\5\32\16\2VW\7"+
+ "T\2\2WX\5\6\4\2XY\5\20\t\2Y\5\3\2\2\2Zf\7\t\2\2[\\\5\32\16\2\\c\7T\2\2"+
+ "]^\7\r\2\2^_\5\32\16\2_`\7T\2\2`b\3\2\2\2a]\3\2\2\2be\3\2\2\2ca\3\2\2"+
+ "\2cd\3\2\2\2dg\3\2\2\2ec\3\2\2\2f[\3\2\2\2fg\3\2\2\2gh\3\2\2\2hi\7\n\2"+
+ "\2i\7\3\2\2\2jo\5\n\6\2kl\5\f\7\2lm\7\16\2\2mo\3\2\2\2nj\3\2\2\2nk\3\2"+
+ "\2\2o\t\3\2\2\2pq\7\17\2\2qr\7\t\2\2rs\5 \21\2st\7\n\2\2tx\5\16\b\2uv"+
+ "\7\21\2\2vy\5\16\b\2wy\6\6\2\2xu\3\2\2\2xw\3\2\2\2y\u00ad\3\2\2\2z{\7"+
+ "\22\2\2{|\7\t\2\2|}\5 \21\2}\u0080\7\n\2\2~\u0081\5\16\b\2\177\u0081\5"+
+ "\22\n\2\u0080~\3\2\2\2\u0080\177\3\2\2\2\u0081\u00ad\3\2\2\2\u0082\u0083"+
+ "\7\24\2\2\u0083\u0085\7\t\2\2\u0084\u0086\5\24\13\2\u0085\u0084\3\2\2"+
+ "\2\u0085\u0086\3\2\2\2\u0086\u0087\3\2\2\2\u0087\u0089\7\16\2\2\u0088"+
+ "\u008a\5 \21\2\u0089\u0088\3\2\2\2\u0089\u008a\3\2\2\2\u008a\u008b\3\2"+
+ "\2\2\u008b\u008d\7\16\2\2\u008c\u008e\5\26\f\2\u008d\u008c\3\2\2\2\u008d"+
+ "\u008e\3\2\2\2\u008e\u008f\3\2\2\2\u008f\u0092\7\n\2\2\u0090\u0093\5\16"+
+ "\b\2\u0091\u0093\5\22\n\2\u0092\u0090\3\2\2\2\u0092\u0091\3\2\2\2\u0093"+
+ "\u00ad\3\2\2\2\u0094\u0095\7\24\2\2\u0095\u0096\7\t\2\2\u0096\u0097\5"+
+ "\32\16\2\u0097\u0098\7T\2\2\u0098\u0099\7\66\2\2\u0099\u009a\5 \21\2\u009a"+
+ "\u009b\7\n\2\2\u009b\u009c\5\16\b\2\u009c\u00ad\3\2\2\2\u009d\u009e\7"+
+ "\24\2\2\u009e\u009f\7\t\2\2\u009f\u00a0\7T\2\2\u00a0\u00a1\7\20\2\2\u00a1"+
+ "\u00a2\5 \21\2\u00a2\u00a3\7\n\2\2\u00a3\u00a4\5\16\b\2\u00a4\u00ad\3"+
+ "\2\2\2\u00a5\u00a6\7\31\2\2\u00a6\u00a8\5\20\t\2\u00a7\u00a9\5\36\20\2"+
+ "\u00a8\u00a7\3\2\2\2\u00a9\u00aa\3\2\2\2\u00aa\u00a8\3\2\2\2\u00aa\u00ab"+
+ "\3\2\2\2\u00ab\u00ad\3\2\2\2\u00acp\3\2\2\2\u00acz\3\2\2\2\u00ac\u0082"+
+ "\3\2\2\2\u00ac\u0094\3\2\2\2\u00ac\u009d\3\2\2\2\u00ac\u00a5\3\2\2\2\u00ad"+
+ "\13\3\2\2\2\u00ae\u00af\7\23\2\2\u00af\u00b0\5\20\t\2\u00b0\u00b1\7\22"+
+ "\2\2\u00b1\u00b2\7\t\2\2\u00b2\u00b3\5 \21\2\u00b3\u00b4\7\n\2\2\u00b4"+
+ "\u00be\3\2\2\2\u00b5\u00be\5\30\r\2\u00b6\u00be\7\25\2\2\u00b7\u00be\7"+
+ "\26\2\2\u00b8\u00b9\7\27\2\2\u00b9\u00be\5 \21\2\u00ba\u00bb\7\33\2\2"+
+ "\u00bb\u00be\5 \21\2\u00bc\u00be\5 \21\2\u00bd\u00ae\3\2\2\2\u00bd\u00b5"+
+ "\3\2\2\2\u00bd\u00b6\3\2\2\2\u00bd\u00b7\3\2\2\2\u00bd\u00b8\3\2\2\2\u00bd"+
+ "\u00ba\3\2\2\2\u00bd\u00bc\3\2\2\2\u00be\r\3\2\2\2\u00bf\u00c2\5\20\t"+
+ "\2\u00c0\u00c2\5\b\5\2\u00c1\u00bf\3\2\2\2\u00c1\u00c0\3\2\2\2\u00c2\17"+
+ "\3\2\2\2\u00c3\u00c7\7\5\2\2\u00c4\u00c6\5\b\5\2\u00c5\u00c4\3\2\2\2\u00c6"+
+ "\u00c9\3\2\2\2\u00c7\u00c5\3\2\2\2\u00c7\u00c8\3\2\2\2\u00c8\u00cb\3\2"+
+ "\2\2\u00c9\u00c7\3\2\2\2\u00ca\u00cc\5\f\7\2\u00cb\u00ca\3\2\2\2\u00cb"+
+ "\u00cc\3\2\2\2\u00cc\u00cd\3\2\2\2\u00cd\u00ce\7\6\2\2\u00ce\21\3\2\2"+
+ "\2\u00cf\u00d0\7\16\2\2\u00d0\23\3\2\2\2\u00d1\u00d4\5\30\r\2\u00d2\u00d4"+
+ "\5 \21\2\u00d3\u00d1\3\2\2\2\u00d3\u00d2\3\2\2\2\u00d4\25\3\2\2\2\u00d5"+
+ "\u00d6\5 \21\2\u00d6\27\3\2\2\2\u00d7\u00d8\5\32\16\2\u00d8\u00dd\5\34"+
+ "\17\2\u00d9\u00da\7\r\2\2\u00da\u00dc\5\34\17\2\u00db\u00d9\3\2\2\2\u00dc"+
+ "\u00df\3\2\2\2\u00dd\u00db\3\2\2\2\u00dd\u00de\3\2\2\2\u00de\31\3\2\2"+
+ "\2\u00df\u00dd\3\2\2\2\u00e0\u00e5\7S\2\2\u00e1\u00e2\7\7\2\2\u00e2\u00e4"+
+ "\7\b\2\2\u00e3\u00e1\3\2\2\2\u00e4\u00e7\3\2\2\2\u00e5\u00e3\3\2\2\2\u00e5"+
+ "\u00e6\3\2\2\2\u00e6\33\3\2\2\2\u00e7\u00e5\3\2\2\2\u00e8\u00eb\7T\2\2"+
+ "\u00e9\u00ea\7>\2\2\u00ea\u00ec\5 \21\2\u00eb\u00e9\3\2\2\2\u00eb\u00ec"+
+ "\3\2\2\2\u00ec\35\3\2\2\2\u00ed\u00ee\7\32\2\2\u00ee\u00ef\7\t\2\2\u00ef"+
+ "\u00f0\7S\2\2\u00f0\u00f1\7T\2\2\u00f1\u00f2\7\n\2\2\u00f2\u00f3\5\20"+
+ "\t\2\u00f3\37\3\2\2\2\u00f4\u00f5\b\21\1\2\u00f5\u00f6\5\"\22\2\u00f6"+
+ "\u0129\3\2\2\2\u00f7\u00f8\f\21\2\2\u00f8\u00f9\t\2\2\2\u00f9\u0128\5"+
+ " \21\22\u00fa\u00fb\f\20\2\2\u00fb\u00fc\t\3\2\2\u00fc\u0128\5 \21\21"+
+ "\u00fd\u00fe\f\17\2\2\u00fe\u00ff\t\4\2\2\u00ff\u0128\5 \21\20\u0100\u0101"+
+ "\f\16\2\2\u0101\u0102\t\5\2\2\u0102\u0128\5 \21\17\u0103\u0104\f\r\2\2"+
+ "\u0104\u0105\t\6\2\2\u0105\u0128\5 \21\16\u0106\u0107\f\13\2\2\u0107\u0108"+
+ "\t\7\2\2\u0108\u0128\5 \21\f\u0109\u010a\f\n\2\2\u010a\u010b\7\60\2\2"+
+ "\u010b\u0128\5 \21\13\u010c\u010d\f\t\2\2\u010d\u010e\7\61\2\2\u010e\u0128"+
+ "\5 \21\n\u010f\u0110\f\b\2\2\u0110\u0111\7\62\2\2\u0111\u0128\5 \21\t"+
+ "\u0112\u0113\f\7\2\2\u0113\u0114\7\63\2\2\u0114\u0128\5 \21\b\u0115\u0116"+
+ "\f\6\2\2\u0116\u0117\7\64\2\2\u0117\u0128\5 \21\7\u0118\u0119\f\5\2\2"+
+ "\u0119\u011a\7\65\2\2\u011a\u011b\5 \21\2\u011b\u011c\7\66\2\2\u011c\u011d"+
+ "\5 \21\5\u011d\u0128\3\2\2\2\u011e\u011f\f\4\2\2\u011f\u0120\7\67\2\2"+
+ "\u0120\u0128\5 \21\4\u0121\u0122\f\3\2\2\u0122\u0123\t\b\2\2\u0123\u0128"+
+ "\5 \21\3\u0124\u0125\f\f\2\2\u0125\u0126\7\35\2\2\u0126\u0128\5\32\16"+
+ "\2\u0127\u00f7\3\2\2\2\u0127\u00fa\3\2\2\2\u0127\u00fd\3\2\2\2\u0127\u0100"+
+ "\3\2\2\2\u0127\u0103\3\2\2\2\u0127\u0106\3\2\2\2\u0127\u0109\3\2\2\2\u0127"+
+ "\u010c\3\2\2\2\u0127\u010f\3\2\2\2\u0127\u0112\3\2\2\2\u0127\u0115\3\2"+
+ "\2\2\u0127\u0118\3\2\2\2\u0127\u011e\3\2\2\2\u0127\u0121\3\2\2\2\u0127"+
+ "\u0124\3\2\2\2\u0128\u012b\3\2\2\2\u0129\u0127\3\2\2\2\u0129\u012a\3\2"+
+ "\2\2\u012a!\3\2\2\2\u012b\u0129\3\2\2\2\u012c\u012d\t\t\2\2\u012d\u013a"+
+ "\5$\23\2\u012e\u012f\5$\23\2\u012f\u0130\t\t\2\2\u0130\u013a\3\2\2\2\u0131"+
+ "\u013a\5$\23\2\u0132\u0133\t\n\2\2\u0133\u013a\5\"\22\2\u0134\u0135\7"+
+ "\t\2\2\u0135\u0136\5\32\16\2\u0136\u0137\7\n\2\2\u0137\u0138\5\"\22\2"+
+ "\u0138\u013a\3\2\2\2\u0139\u012c\3\2\2\2\u0139\u012e\3\2\2\2\u0139\u0131"+
+ "\3\2\2\2\u0139\u0132\3\2\2\2\u0139\u0134\3\2\2\2\u013a#\3\2\2\2\u013b"+
+ "\u013f\5&\24\2\u013c\u013e\5(\25\2\u013d\u013c\3\2\2\2\u013e\u0141\3\2"+
+ "\2\2\u013f\u013d\3\2\2\2\u013f\u0140\3\2\2\2\u0140\u014c\3\2\2\2\u0141"+
+ "\u013f\3\2\2\2\u0142\u0143\5\32\16\2\u0143\u0147\5*\26\2\u0144\u0146\5"+
+ "(\25\2\u0145\u0144\3\2\2\2\u0146\u0149\3\2\2\2\u0147\u0145\3\2\2\2\u0147"+
+ "\u0148\3\2\2\2\u0148\u014c\3\2\2\2\u0149\u0147\3\2\2\2\u014a\u014c\5\62"+
+ "\32\2\u014b\u013b\3\2\2\2\u014b\u0142\3\2\2\2\u014b\u014a\3\2\2\2\u014c"+
+ "%\3\2\2\2\u014d\u014e\7\t\2\2\u014e\u014f\5 \21\2\u014f\u0150\7\n\2\2"+
+ "\u0150\u0160\3\2\2\2\u0151\u0160\t\13\2\2\u0152\u0160\7P\2\2\u0153\u0160"+
+ "\7Q\2\2\u0154\u0160\7R\2\2\u0155\u0160\7N\2\2\u0156\u0160\7O\2\2\u0157"+
+ "\u0160\5\64\33\2\u0158\u0160\5\66\34\2\u0159\u0160\7T\2\2\u015a\u015b"+
+ "\7T\2\2\u015b\u0160\5:\36\2\u015c\u015d\7\30\2\2\u015d\u015e\7S\2\2\u015e"+
+ "\u0160\5:\36\2\u015f\u014d\3\2\2\2\u015f\u0151\3\2\2\2\u015f\u0152\3\2"+
+ "\2\2\u015f\u0153\3\2\2\2\u015f\u0154\3\2\2\2\u015f\u0155\3\2\2\2\u015f"+
+ "\u0156\3\2\2\2\u015f\u0157\3\2\2\2\u015f\u0158\3\2\2\2\u015f\u0159\3\2"+
+ "\2\2\u015f\u015a\3\2\2\2\u015f\u015c\3\2\2\2\u0160\'\3\2\2\2\u0161\u0165"+
+ "\5,\27\2\u0162\u0165\5.\30\2\u0163\u0165\5\60\31\2\u0164\u0161\3\2\2\2"+
+ "\u0164\u0162\3\2\2\2\u0164\u0163\3\2\2\2\u0165)\3\2\2\2\u0166\u0169\5"+
+ ",\27\2\u0167\u0169\5.\30\2\u0168\u0166\3\2\2\2\u0168\u0167\3\2\2\2\u0169"+
+ "+\3\2\2\2\u016a\u016b\t\f\2\2\u016b\u016c\7V\2\2\u016c\u016d\5:\36\2\u016d"+
+ "-\3\2\2\2\u016e\u016f\t\f\2\2\u016f\u0170\t\r\2\2\u0170/\3\2\2\2\u0171"+
+ "\u0172\7\7\2\2\u0172\u0173\5 \21\2\u0173\u0174\7\b\2\2\u0174\61\3\2\2"+
+ "\2\u0175\u0176\7\30\2\2\u0176\u017b\7S\2\2\u0177\u0178\7\7\2\2\u0178\u0179"+
+ "\5 \21\2\u0179\u017a\7\b\2\2\u017a\u017c\3\2\2\2\u017b\u0177\3\2\2\2\u017c"+
+ "\u017d\3\2\2\2\u017d\u017b\3\2\2\2\u017d\u017e\3\2\2\2\u017e\u0186\3\2"+
+ "\2\2\u017f\u0183\5*\26\2\u0180\u0182\5(\25\2\u0181\u0180\3\2\2\2\u0182"+
+ "\u0185\3\2\2\2\u0183\u0181\3\2\2\2\u0183\u0184\3\2\2\2\u0184\u0187\3\2"+
+ "\2\2\u0185\u0183\3\2\2\2\u0186\u017f\3\2\2\2\u0186\u0187\3\2\2\2\u0187"+
+ "\u019f\3\2\2\2\u0188\u0189\7\30\2\2\u0189\u018a\7S\2\2\u018a\u018b\7\7"+
+ "\2\2\u018b\u018c\7\b\2\2\u018c\u0195\7\5\2\2\u018d\u0192\5 \21\2\u018e"+
+ "\u018f\7\r\2\2\u018f\u0191\5 \21\2\u0190\u018e\3\2\2\2\u0191\u0194\3\2"+
+ "\2\2\u0192\u0190\3\2\2\2\u0192\u0193\3\2\2\2\u0193\u0196\3\2\2\2\u0194"+
+ "\u0192\3\2\2\2\u0195\u018d\3\2\2\2\u0195\u0196\3\2\2\2\u0196\u0197\3\2"+
+ "\2\2\u0197\u019b\7\6\2\2\u0198\u019a\5(\25\2\u0199\u0198\3\2\2\2\u019a"+
"\u019d\3\2\2\2\u019b\u0199\3\2\2\2\u019b\u019c\3\2\2\2\u019c\u019f\3\2"+
- "\2\2\u019d\u019b\3\2\2\2\u019e\u0172\3\2\2\2\u019e\u0185\3\2\2\2\u019f"+
- "\61\3\2\2\2\u01a0\u01a1\7\7\2\2\u01a1\u01a6\5\36\20\2\u01a2\u01a3\7\r"+
- "\2\2\u01a3\u01a5\5\36\20\2\u01a4\u01a2\3\2\2\2\u01a5\u01a8\3\2\2\2\u01a6"+
- "\u01a4\3\2\2\2\u01a6\u01a7\3\2\2\2\u01a7\u01a9\3\2\2\2\u01a8\u01a6\3\2"+
- "\2\2\u01a9\u01aa\7\b\2\2\u01aa\u01ae\3\2\2\2\u01ab\u01ac\7\7\2\2\u01ac"+
- "\u01ae\7\b\2\2\u01ad\u01a0\3\2\2\2\u01ad\u01ab\3\2\2\2\u01ae\63\3\2\2"+
- "\2\u01af\u01b0\7\7\2\2\u01b0\u01b5\5\66\34\2\u01b1\u01b2\7\r\2\2\u01b2"+
- "\u01b4\5\66\34\2\u01b3\u01b1\3\2\2\2\u01b4\u01b7\3\2\2\2\u01b5\u01b3\3"+
- "\2\2\2\u01b5\u01b6\3\2\2\2\u01b6\u01b8\3\2\2\2\u01b7\u01b5\3\2\2\2\u01b8"+
- "\u01b9\7\b\2\2\u01b9\u01be\3\2\2\2\u01ba\u01bb\7\7\2\2\u01bb\u01bc\7\66"+
- "\2\2\u01bc\u01be\7\b\2\2\u01bd\u01af\3\2\2\2\u01bd\u01ba\3\2\2\2\u01be"+
- "\65\3\2\2\2\u01bf\u01c0\5\36\20\2\u01c0\u01c1\7\66\2\2\u01c1\u01c2\5\36"+
- "\20\2\u01c2\67\3\2\2\2\u01c3\u01cc\7\t\2\2\u01c4\u01c9\5:\36\2\u01c5\u01c6"+
- "\7\r\2\2\u01c6\u01c8\5:\36\2\u01c7\u01c5\3\2\2\2\u01c8\u01cb\3\2\2\2\u01c9"+
- "\u01c7\3\2\2\2\u01c9\u01ca\3\2\2\2\u01ca\u01cd\3\2\2\2\u01cb\u01c9\3\2"+
- "\2\2\u01cc\u01c4\3\2\2\2\u01cc\u01cd\3\2\2\2\u01cd\u01ce\3\2\2\2\u01ce"+
- "\u01cf\7\n\2\2\u01cf9\3\2\2\2\u01d0\u01d4\5\36\20\2\u01d1\u01d4\5<\37"+
- "\2\u01d2\u01d4\5@!\2\u01d3\u01d0\3\2\2\2\u01d3\u01d1\3\2\2\2\u01d3\u01d2"+
- "\3\2\2\2\u01d4;\3\2\2\2\u01d5\u01e3\5> \2\u01d6\u01df\7\t\2\2\u01d7\u01dc"+
- "\5> \2\u01d8\u01d9\7\r\2\2\u01d9\u01db\5> \2\u01da\u01d8\3\2\2\2\u01db"+
- "\u01de\3\2\2\2\u01dc\u01da\3\2\2\2\u01dc\u01dd\3\2\2\2\u01dd\u01e0\3\2"+
- "\2\2\u01de\u01dc\3\2\2\2\u01df\u01d7\3\2\2\2\u01df\u01e0\3\2\2\2\u01e0"+
- "\u01e1\3\2\2\2\u01e1\u01e3\7\n\2\2\u01e2\u01d5\3\2\2\2\u01e2\u01d6\3\2"+
- "\2\2\u01e3\u01e4\3\2\2\2\u01e4\u01e7\79\2\2\u01e5\u01e8\5\f\7\2\u01e6"+
- "\u01e8\5\36\20\2\u01e7\u01e5\3\2\2\2\u01e7\u01e6\3\2\2\2\u01e8=\3\2\2"+
- "\2\u01e9\u01eb\5\26\f\2\u01ea\u01e9\3\2\2\2\u01ea\u01eb\3\2\2\2\u01eb"+
- "\u01ec\3\2\2\2\u01ec\u01ed\7T\2\2\u01ed?\3\2\2\2\u01ee\u01ef\7S\2\2\u01ef"+
- "\u01f0\78\2\2\u01f0\u01fc\7T\2\2\u01f1\u01f2\5\26\f\2\u01f2\u01f3\78\2"+
- "\2\u01f3\u01f4\7\30\2\2\u01f4\u01fc\3\2\2\2\u01f5\u01f6\7T\2\2\u01f6\u01f7"+
- "\78\2\2\u01f7\u01fc\7T\2\2\u01f8\u01f9\7\34\2\2\u01f9\u01fa\78\2\2\u01fa"+
- "\u01fc\7T\2\2\u01fb\u01ee\3\2\2\2\u01fb\u01f1\3\2\2\2\u01fb\u01f5\3\2"+
- "\2\2\u01fb\u01f8\3\2\2\2\u01fcA\3\2\2\2\62EK^amu\u0082\u0086\u008a\u008f"+
- "\u00b2\u00bb\u00bf\u00c5\u00ce\u00d8\u00e0\u00e6\u0124\u0126\u0136\u013c"+
- "\u0144\u0148\u015c\u0161\u0165\u017a\u0180\u0183\u018f\u0192\u0195\u019b"+
- "\u019e\u01a6\u01ad\u01b5\u01bd\u01c9\u01cc\u01d3\u01dc\u01df\u01e2\u01e7"+
- "\u01ea\u01fb";
+ "\2\2\u019d\u019b\3\2\2\2\u019e\u0175\3\2\2\2\u019e\u0188\3\2\2\2\u019f"+
+ "\63\3\2\2\2\u01a0\u01a1\7\7\2\2\u01a1\u01a6\5 \21\2\u01a2\u01a3\7\r\2"+
+ "\2\u01a3\u01a5\5 \21\2\u01a4\u01a2\3\2\2\2\u01a5\u01a8\3\2\2\2\u01a6\u01a4"+
+ "\3\2\2\2\u01a6\u01a7\3\2\2\2\u01a7\u01a9\3\2\2\2\u01a8\u01a6\3\2\2\2\u01a9"+
+ "\u01aa\7\b\2\2\u01aa\u01ae\3\2\2\2\u01ab\u01ac\7\7\2\2\u01ac\u01ae\7\b"+
+ "\2\2\u01ad\u01a0\3\2\2\2\u01ad\u01ab\3\2\2\2\u01ae\65\3\2\2\2\u01af\u01b0"+
+ "\7\7\2\2\u01b0\u01b5\58\35\2\u01b1\u01b2\7\r\2\2\u01b2\u01b4\58\35\2\u01b3"+
+ "\u01b1\3\2\2\2\u01b4\u01b7\3\2\2\2\u01b5\u01b3\3\2\2\2\u01b5\u01b6\3\2"+
+ "\2\2\u01b6\u01b8\3\2\2\2\u01b7\u01b5\3\2\2\2\u01b8\u01b9\7\b\2\2\u01b9"+
+ "\u01be\3\2\2\2\u01ba\u01bb\7\7\2\2\u01bb\u01bc\7\66\2\2\u01bc\u01be\7"+
+ "\b\2\2\u01bd\u01af\3\2\2\2\u01bd\u01ba\3\2\2\2\u01be\67\3\2\2\2\u01bf"+
+ "\u01c0\5 \21\2\u01c0\u01c1\7\66\2\2\u01c1\u01c2\5 \21\2\u01c29\3\2\2\2"+
+ "\u01c3\u01cc\7\t\2\2\u01c4\u01c9\5<\37\2\u01c5\u01c6\7\r\2\2\u01c6\u01c8"+
+ "\5<\37\2\u01c7\u01c5\3\2\2\2\u01c8\u01cb\3\2\2\2\u01c9\u01c7\3\2\2\2\u01c9"+
+ "\u01ca\3\2\2\2\u01ca\u01cd\3\2\2\2\u01cb\u01c9\3\2\2\2\u01cc\u01c4\3\2"+
+ "\2\2\u01cc\u01cd\3\2\2\2\u01cd\u01ce\3\2\2\2\u01ce\u01cf\7\n\2\2\u01cf"+
+ ";\3\2\2\2\u01d0\u01d4\5 \21\2\u01d1\u01d4\5> \2\u01d2\u01d4\5B\"\2\u01d3"+
+ "\u01d0\3\2\2\2\u01d3\u01d1\3\2\2\2\u01d3\u01d2\3\2\2\2\u01d4=\3\2\2\2"+
+ "\u01d5\u01e3\5@!\2\u01d6\u01df\7\t\2\2\u01d7\u01dc\5@!\2\u01d8\u01d9\7"+
+ "\r\2\2\u01d9\u01db\5@!\2\u01da\u01d8\3\2\2\2\u01db\u01de\3\2\2\2\u01dc"+
+ "\u01da\3\2\2\2\u01dc\u01dd\3\2\2\2\u01dd\u01e0\3\2\2\2\u01de\u01dc\3\2"+
+ "\2\2\u01df\u01d7\3\2\2\2\u01df\u01e0\3\2\2\2\u01e0\u01e1\3\2\2\2\u01e1"+
+ "\u01e3\7\n\2\2\u01e2\u01d5\3\2\2\2\u01e2\u01d6\3\2\2\2\u01e3\u01e4\3\2"+
+ "\2\2\u01e4\u01e7\79\2\2\u01e5\u01e8\5\20\t\2\u01e6\u01e8\5 \21\2\u01e7"+
+ "\u01e5\3\2\2\2\u01e7\u01e6\3\2\2\2\u01e8?\3\2\2\2\u01e9\u01eb\5\32\16"+
+ "\2\u01ea\u01e9\3\2\2\2\u01ea\u01eb\3\2\2\2\u01eb\u01ec\3\2\2\2\u01ec\u01ed"+
+ "\7T\2\2\u01edA\3\2\2\2\u01ee\u01ef\7S\2\2\u01ef\u01f0\78\2\2\u01f0\u01fc"+
+ "\7T\2\2\u01f1\u01f2\5\32\16\2\u01f2\u01f3\78\2\2\u01f3\u01f4\7\30\2\2"+
+ "\u01f4\u01fc\3\2\2\2\u01f5\u01f6\7T\2\2\u01f6\u01f7\78\2\2\u01f7\u01fc"+
+ "\7T\2\2\u01f8\u01f9\7\34\2\2\u01f9\u01fa\78\2\2\u01fa\u01fc\7T\2\2\u01fb"+
+ "\u01ee\3\2\2\2\u01fb\u01f1\3\2\2\2\u01fb\u01f5\3\2\2\2\u01fb\u01f8\3\2"+
+ "\2\2\u01fcC\3\2\2\2\65GMQcfnx\u0080\u0085\u0089\u008d\u0092\u00aa\u00ac"+
+ "\u00bd\u00c1\u00c7\u00cb\u00d3\u00dd\u00e5\u00eb\u0127\u0129\u0139\u013f"+
+ "\u0147\u014b\u015f\u0164\u0168\u017d\u0183\u0186\u0192\u0195\u019b\u019e"+
+ "\u01a6\u01ad\u01b5\u01bd\u01c9\u01cc\u01d3\u01dc\u01df\u01e2\u01e7\u01ea"+
+ "\u01fb";
public static final ATN _ATN =
new ATNDeserializer().deserialize(_serializedATN.toCharArray());
static {
diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/PainlessParserBaseVisitor.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/PainlessParserBaseVisitor.java
index 8c4741e672533..81e7166d9a9ae 100644
--- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/PainlessParserBaseVisitor.java
+++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/PainlessParserBaseVisitor.java
@@ -38,21 +38,21 @@ class PainlessParserBaseVisitor<T> extends AbstractParseTreeVisitor<T> implement
* <p>The default implementation returns the result of calling
* {@link #visitChildren} on {@code ctx}.</p>
*/
- @Override public T visitIf(PainlessParser.IfContext ctx) { return visitChildren(ctx); }
+ @Override public T visitStatement(PainlessParser.StatementContext ctx) { return visitChildren(ctx); }
/**
* {@inheritDoc}
*
* <p>The default implementation returns the result of calling
* {@link #visitChildren} on {@code ctx}.</p>
*/
- @Override public T visitWhile(PainlessParser.WhileContext ctx) { return visitChildren(ctx); }
+ @Override public T visitIf(PainlessParser.IfContext ctx) { return visitChildren(ctx); }
/**
* {@inheritDoc}
*
* <p>The default implementation returns the result of calling
* {@link #visitChildren} on {@code ctx}.</p>
*/
- @Override public T visitDo(PainlessParser.DoContext ctx) { return visitChildren(ctx); }
+ @Override public T visitWhile(PainlessParser.WhileContext ctx) { return visitChildren(ctx); }
/**
* {@inheritDoc}
*
@@ -80,35 +80,42 @@ class PainlessParserBaseVisitor<T> extends AbstractParseTreeVisitor<T> implement
* <p>The default implementation returns the result of calling
* {@link #visitChildren} on {@code ctx}.</p>
*/
- @Override public T visitDecl(PainlessParser.DeclContext ctx) { return visitChildren(ctx); }
+ @Override public T visitTry(PainlessParser.TryContext ctx) { return visitChildren(ctx); }
/**
* {@inheritDoc}
*
* <p>The default implementation returns the result of calling
* {@link #visitChildren} on {@code ctx}.</p>
*/
- @Override public T visitContinue(PainlessParser.ContinueContext ctx) { return visitChildren(ctx); }
+ @Override public T visitDo(PainlessParser.DoContext ctx) { return visitChildren(ctx); }
/**
* {@inheritDoc}
*
* <p>The default implementation returns the result of calling
* {@link #visitChildren} on {@code ctx}.</p>
*/
- @Override public T visitBreak(PainlessParser.BreakContext ctx) { return visitChildren(ctx); }
+ @Override public T visitDecl(PainlessParser.DeclContext ctx) { return visitChildren(ctx); }
/**
* {@inheritDoc}
*
* <p>The default implementation returns the result of calling
* {@link #visitChildren} on {@code ctx}.</p>
*/
- @Override public T visitReturn(PainlessParser.ReturnContext ctx) { return visitChildren(ctx); }
+ @Override public T visitContinue(PainlessParser.ContinueContext ctx) { return visitChildren(ctx); }
/**
* {@inheritDoc}
*
* <p>The default implementation returns the result of calling
* {@link #visitChildren} on {@code ctx}.</p>
*/
- @Override public T visitTry(PainlessParser.TryContext ctx) { return visitChildren(ctx); }
+ @Override public T visitBreak(PainlessParser.BreakContext ctx) { return visitChildren(ctx); }
+ /**
+ * {@inheritDoc}
+ *
+ * <p>The default implementation returns the result of calling
+ * {@link #visitChildren} on {@code ctx}.</p>
+ */
+ @Override public T visitReturn(PainlessParser.ReturnContext ctx) { return visitChildren(ctx); }
/**
* {@inheritDoc}
*
@@ -186,13 +193,6 @@ class PainlessParserBaseVisitor<T> extends AbstractParseTreeVisitor<T> implement
* {@link #visitChildren} on {@code ctx}.</p>
*/
@Override public T visitTrap(PainlessParser.TrapContext ctx) { return visitChildren(ctx); }
- /**
- * {@inheritDoc}
- *
- * <p>The default implementation returns the result of calling
- * {@link #visitChildren} on {@code ctx}.</p>
- */
- @Override public T visitDelimiter(PainlessParser.DelimiterContext ctx) { return visitChildren(ctx); }
/**
* {@inheritDoc}
*
diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/PainlessParserVisitor.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/PainlessParserVisitor.java
index 47bfd4a1d05b9..ec3e251f3e9ad 100644
--- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/PainlessParserVisitor.java
+++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/PainlessParserVisitor.java
@@ -29,92 +29,98 @@ interface PainlessParserVisitor<T> extends ParseTreeVisitor<T> {
*/
T visitParameters(PainlessParser.ParametersContext ctx);
/**
- * Visit a parse tree produced by the {@code if}
- * labeled alternative in {@link PainlessParser#statement}.
+ * Visit a parse tree produced by {@link PainlessParser#statement}.
* @param ctx the parse tree
* @return the visitor result
*/
- T visitIf(PainlessParser.IfContext ctx);
+ T visitStatement(PainlessParser.StatementContext ctx);
/**
- * Visit a parse tree produced by the {@code while}
- * labeled alternative in {@link PainlessParser#statement}.
+ * Visit a parse tree produced by the {@code if}
+ * labeled alternative in {@link PainlessParser#rstatement}.
* @param ctx the parse tree
* @return the visitor result
*/
- T visitWhile(PainlessParser.WhileContext ctx);
+ T visitIf(PainlessParser.IfContext ctx);
/**
- * Visit a parse tree produced by the {@code do}
- * labeled alternative in {@link PainlessParser#statement}.
+ * Visit a parse tree produced by the {@code while}
+ * labeled alternative in {@link PainlessParser#rstatement}.
* @param ctx the parse tree
* @return the visitor result
*/
- T visitDo(PainlessParser.DoContext ctx);
+ T visitWhile(PainlessParser.WhileContext ctx);
/**
* Visit a parse tree produced by the {@code for}
- * labeled alternative in {@link PainlessParser#statement}.
+ * labeled alternative in {@link PainlessParser#rstatement}.
* @param ctx the parse tree
* @return the visitor result
*/
T visitFor(PainlessParser.ForContext ctx);
/**
* Visit a parse tree produced by the {@code each}
- * labeled alternative in {@link PainlessParser#statement}.
+ * labeled alternative in {@link PainlessParser#rstatement}.
* @param ctx the parse tree
* @return the visitor result
*/
T visitEach(PainlessParser.EachContext ctx);
/**
* Visit a parse tree produced by the {@code ineach}
- * labeled alternative in {@link PainlessParser#statement}.
+ * labeled alternative in {@link PainlessParser#rstatement}.
* @param ctx the parse tree
* @return the visitor result
*/
T visitIneach(PainlessParser.IneachContext ctx);
+ /**
+ * Visit a parse tree produced by the {@code try}
+ * labeled alternative in {@link PainlessParser#rstatement}.
+ * @param ctx the parse tree
+ * @return the visitor result
+ */
+ T visitTry(PainlessParser.TryContext ctx);
+ /**
+ * Visit a parse tree produced by the {@code do}
+ * labeled alternative in {@link PainlessParser#dstatement}.
+ * @param ctx the parse tree
+ * @return the visitor result
+ */
+ T visitDo(PainlessParser.DoContext ctx);
/**
* Visit a parse tree produced by the {@code decl}
- * labeled alternative in {@link PainlessParser#statement}.
+ * labeled alternative in {@link PainlessParser#dstatement}.
* @param ctx the parse tree
* @return the visitor result
*/
T visitDecl(PainlessParser.DeclContext ctx);
/**
* Visit a parse tree produced by the {@code continue}
- * labeled alternative in {@link PainlessParser#statement}.
+ * labeled alternative in {@link PainlessParser#dstatement}.
* @param ctx the parse tree
* @return the visitor result
*/
T visitContinue(PainlessParser.ContinueContext ctx);
/**
* Visit a parse tree produced by the {@code break}
- * labeled alternative in {@link PainlessParser#statement}.
+ * labeled alternative in {@link PainlessParser#dstatement}.
* @param ctx the parse tree
* @return the visitor result
*/
T visitBreak(PainlessParser.BreakContext ctx);
/**
* Visit a parse tree produced by the {@code return}
- * labeled alternative in {@link PainlessParser#statement}.
+ * labeled alternative in {@link PainlessParser#dstatement}.
* @param ctx the parse tree
* @return the visitor result
*/
T visitReturn(PainlessParser.ReturnContext ctx);
- /**
- * Visit a parse tree produced by the {@code try}
- * labeled alternative in {@link PainlessParser#statement}.
- * @param ctx the parse tree
- * @return the visitor result
- */
- T visitTry(PainlessParser.TryContext ctx);
/**
* Visit a parse tree produced by the {@code throw}
- * labeled alternative in {@link PainlessParser#statement}.
+ * labeled alternative in {@link PainlessParser#dstatement}.
* @param ctx the parse tree
* @return the visitor result
*/
T visitThrow(PainlessParser.ThrowContext ctx);
/**
* Visit a parse tree produced by the {@code expr}
- * labeled alternative in {@link PainlessParser#statement}.
+ * labeled alternative in {@link PainlessParser#dstatement}.
* @param ctx the parse tree
* @return the visitor result
*/
@@ -173,12 +179,6 @@ interface PainlessParserVisitor<T> extends ParseTreeVisitor<T> {
* @return the visitor result
*/
T visitTrap(PainlessParser.TrapContext ctx);
- /**
- * Visit a parse tree produced by {@link PainlessParser#delimiter}.
- * @param ctx the parse tree
- * @return the visitor result
- */
- T visitDelimiter(PainlessParser.DelimiterContext ctx);
/**
* Visit a parse tree produced by the {@code single}
* labeled alternative in {@link PainlessParser#expression}.
diff --git a/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/Walker.java b/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/Walker.java
index a15f87966eae2..3ac6cb7fd37c4 100644
--- a/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/Walker.java
+++ b/modules/lang-painless/src/main/java/org/elasticsearch/painless/antlr/Walker.java
@@ -56,7 +56,6 @@
import org.elasticsearch.painless.antlr.PainlessParser.DeclarationContext;
import org.elasticsearch.painless.antlr.PainlessParser.DecltypeContext;
import org.elasticsearch.painless.antlr.PainlessParser.DeclvarContext;
-import org.elasticsearch.painless.antlr.PainlessParser.DelimiterContext;
import org.elasticsearch.painless.antlr.PainlessParser.DoContext;
import org.elasticsearch.painless.antlr.PainlessParser.DynamicContext;
import org.elasticsearch.painless.antlr.PainlessParser.EachContext;
@@ -264,6 +263,10 @@ public ANode visitSource(SourceContext ctx) {
statements.add((AStatement)visit(statement));
}
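+ // The grammar allows one optional trailing dstatement after the regular statements; append it if present.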
+ if (ctx.dstatement() != null) {
+ statements.add((AStatement)visit(ctx.dstatement()));
+ }
+
return new SSource(scriptClassInfo, settings, sourceName, sourceText, debugStream, (MainMethodReserved)reserved.pop(),
location(ctx), functions, globals, statements);
}
@@ -290,6 +293,10 @@ public ANode visitFunction(FunctionContext ctx) {
statements.add((AStatement)visit(statement));
}
+ if (ctx.block().dstatement() != null) {
+ statements.add((AStatement)visit(ctx.block().dstatement()));
+ }
+
return new SFunction((FunctionReserved)reserved.pop(), location(ctx), rtnType, name,
paramTypes, paramNames, statements, false);
}
@@ -299,6 +306,17 @@ public ANode visitParameters(ParametersContext ctx) {
throw location(ctx).createError(new IllegalStateException("Illegal tree structure."));
}
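+ // Statements are now split into rstatement and dstatement alternatives;
+ // delegate to whichever alternative actually matched.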
+ @Override
+ public ANode visitStatement(StatementContext ctx) {
+ if (ctx.rstatement() != null) {
+ return visit(ctx.rstatement());
+ } else if (ctx.dstatement() != null) {
+ return visit(ctx.dstatement());
+ } else {
+ throw location(ctx).createError(new IllegalStateException("Illegal tree structure."));
+ }
+ }
+
@Override
public ANode visitIf(IfContext ctx) {
AExpression expression = (AExpression)visit(ctx.expression());
@@ -446,7 +464,7 @@ public ANode visitTrailer(TrailerContext ctx) {
@Override
public ANode visitBlock(BlockContext ctx) {
- if (ctx.statement().isEmpty()) {
+ if (ctx.statement().isEmpty() && ctx.dstatement() == null) {
return null;
} else {
List<AStatement> statements = new ArrayList<>();
@@ -455,6 +473,10 @@ public ANode visitBlock(BlockContext ctx) {
statements.add((AStatement)visit(statement));
}
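+ // As in visitSource, append the block's optional trailing dstatement if present.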
+ if (ctx.dstatement() != null) {
+ statements.add((AStatement)visit(ctx.dstatement()));
+ }
+
return new SBlock(location(ctx), statements);
}
}
@@ -514,11 +536,6 @@ public ANode visitTrap(TrapContext ctx) {
return new SCatch(location(ctx), type, name, block);
}
- @Override
- public ANode visitDelimiter(DelimiterContext ctx) {
- throw location(ctx).createError(new IllegalStateException("Illegal tree structure."));
- }
-
@Override
public ANode visitSingle(SingleContext ctx) {
return visit(ctx.unary());
@@ -1074,6 +1091,10 @@ public ANode visitLambda(LambdaContext ctx) {
for (StatementContext statement : ctx.block().statement()) {
statements.add((AStatement)visit(statement));
}
+
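+ // Lambda block bodies get the same trailing-dstatement handling as regular blocks.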
+ if (ctx.block().dstatement() != null) {
+ statements.add((AStatement)visit(ctx.block().dstatement()));
+ }
}
FunctionReserved lambdaReserved = (FunctionReserved)reserved.pop();
diff --git a/modules/lang-painless/src/test/java/org/elasticsearch/painless/PainlessExecuteRequestTests.java b/modules/lang-painless/src/test/java/org/elasticsearch/painless/PainlessExecuteRequestTests.java
new file mode 100644
index 0000000000000..488ae0e1643bc
--- /dev/null
+++ b/modules/lang-painless/src/test/java/org/elasticsearch/painless/PainlessExecuteRequestTests.java
@@ -0,0 +1,61 @@
+/*
+ * Licensed to Elasticsearch under one or more contributor
+ * license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright
+ * ownership. Elasticsearch licenses this file to you under
+ * the Apache License, Version 2.0 (the "License"); you may
+ * not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.elasticsearch.painless;
+
+import org.elasticsearch.common.xcontent.XContentParser;
+import org.elasticsearch.script.Script;
+import org.elasticsearch.script.ScriptType;
+import org.elasticsearch.test.AbstractStreamableXContentTestCase;
+
+import java.io.IOException;
+import java.util.Collections;
+
+public class PainlessExecuteRequestTests extends AbstractStreamableXContentTestCase<PainlessExecuteAction.Request> {
+
+ @Override
+ protected PainlessExecuteAction.Request createTestInstance() {
+ Script script = new Script(randomAlphaOfLength(10));
+ PainlessExecuteAction.Request.SupportedContext context = randomBoolean() ?
+ PainlessExecuteAction.Request.SupportedContext.PAINLESS_TEST : null;
+ return new PainlessExecuteAction.Request(script, context);
+ }
+
+ @Override
+ protected PainlessExecuteAction.Request createBlankInstance() {
+ return new PainlessExecuteAction.Request();
+ }
+
+ @Override
+ protected PainlessExecuteAction.Request doParseInstance(XContentParser parser) throws IOException {
+ return PainlessExecuteAction.Request.parse(parser);
+ }
+
+ @Override
+ protected boolean supportsUnknownFields() {
+ return false;
+ }
+
+ public void testValidate() {
+ Script script = new Script(ScriptType.STORED, null, randomAlphaOfLength(10), Collections.emptyMap());
+ PainlessExecuteAction.Request request = new PainlessExecuteAction.Request(script, null);
+ Exception e = request.validate();
+ assertNotNull(e);
+ assertEquals("Validation Failed: 1: only inline scripts are supported;", e.getMessage());
+ }
+}
diff --git a/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/TestRatingEnum.java b/modules/lang-painless/src/test/java/org/elasticsearch/painless/PainlessExecuteResponseTests.java
similarity index 60%
rename from modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/TestRatingEnum.java
rename to modules/lang-painless/src/test/java/org/elasticsearch/painless/PainlessExecuteResponseTests.java
index ea44c215d9214..20f3cf08e04c8 100644
--- a/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/TestRatingEnum.java
+++ b/modules/lang-painless/src/test/java/org/elasticsearch/painless/PainlessExecuteResponseTests.java
@@ -16,9 +16,19 @@
* specific language governing permissions and limitations
* under the License.
*/
+package org.elasticsearch.painless;
-package org.elasticsearch.index.rankeval;
+import org.elasticsearch.test.AbstractStreamableTestCase;
-enum TestRatingEnum {
- IRRELEVANT, RELEVANT;
-}
\ No newline at end of file
+public class PainlessExecuteResponseTests extends AbstractStreamableTestCase<PainlessExecuteAction.Response> {
+
+ @Override
+ protected PainlessExecuteAction.Response createBlankInstance() {
+ return new PainlessExecuteAction.Response();
+ }
+
+ @Override
+ protected PainlessExecuteAction.Response createTestInstance() {
+ return new PainlessExecuteAction.Response(randomAlphaOfLength(10));
+ }
+}
diff --git a/modules/lang-painless/src/test/java/org/elasticsearch/painless/RegexTests.java b/modules/lang-painless/src/test/java/org/elasticsearch/painless/RegexTests.java
index 0a66b67a2e8ac..911a50468cc17 100644
--- a/modules/lang-painless/src/test/java/org/elasticsearch/painless/RegexTests.java
+++ b/modules/lang-painless/src/test/java/org/elasticsearch/painless/RegexTests.java
@@ -278,6 +278,6 @@ public void testBogusRegexFlag() {
IllegalArgumentException e = expectScriptThrows(IllegalArgumentException.class, () -> {
exec("/asdf/b", false); // Not picky so we get a non-assertion error
});
- assertEquals("unexpected token ['b'] was expecting one of [{, ';'}].", e.getMessage());
+ assertEquals("invalid sequence of tokens near ['b'].", e.getMessage());
}
}
diff --git a/modules/lang-painless/src/test/java/org/elasticsearch/painless/WhenThingsGoWrongTests.java b/modules/lang-painless/src/test/java/org/elasticsearch/painless/WhenThingsGoWrongTests.java
index d60da7b795fbc..1bb754db84745 100644
--- a/modules/lang-painless/src/test/java/org/elasticsearch/painless/WhenThingsGoWrongTests.java
+++ b/modules/lang-painless/src/test/java/org/elasticsearch/painless/WhenThingsGoWrongTests.java
@@ -256,7 +256,7 @@ public void testRCurlyNotDelim() {
// We don't want PICKY here so we get the normal error message
exec("def i = 1} return 1", emptyMap(), emptyMap(), null, false);
});
- assertEquals("unexpected token ['}'] was expecting one of [].", e.getMessage());
+ assertEquals("invalid sequence of tokens near ['}'].", e.getMessage());
}
public void testBadBoxingCast() {
diff --git a/modules/lang-painless/src/test/resources/rest-api-spec/test/painless/50_script_doc_values.yml b/modules/lang-painless/src/test/resources/rest-api-spec/test/painless/50_script_doc_values.yml
index ce8c03afec607..ede2927b992e0 100644
--- a/modules/lang-painless/src/test/resources/rest-api-spec/test/painless/50_script_doc_values.yml
+++ b/modules/lang-painless/src/test/resources/rest-api-spec/test/painless/50_script_doc_values.yml
@@ -3,6 +3,8 @@ setup:
indices.create:
index: test
body:
+ settings:
+ number_of_shards: 1
mappings:
test:
properties:
diff --git a/modules/lang-painless/src/test/resources/rest-api-spec/test/painless/70_execute_painless_scripts.yml b/modules/lang-painless/src/test/resources/rest-api-spec/test/painless/70_execute_painless_scripts.yml
new file mode 100644
index 0000000000000..7b915cc38dbc0
--- /dev/null
+++ b/modules/lang-painless/src/test/resources/rest-api-spec/test/painless/70_execute_painless_scripts.yml
@@ -0,0 +1,25 @@
+---
+"Execute with defaults":
+ - do:
+ scripts_painless_execute:
+ body:
+ script:
+ source: "params.count / params.total"
+ params:
+ count: 100.0
+ total: 1000.0
+ - match: { result: "0.1" }
+
+---
+"Execute with execute_api_script context":
+ - do:
+ scripts_painless_execute:
+ body:
+ script:
+ source: "params.var1 - params.var2"
+ params:
+ var1: 10
+ var2: 100
+ context:
+ painless_test: {}
+ - match: { result: "-90" }
diff --git a/modules/rank-eval/src/main/java/org/elasticsearch/index/rankeval/MeanReciprocalRank.java b/modules/rank-eval/src/main/java/org/elasticsearch/index/rankeval/MeanReciprocalRank.java
index 0f51f6d5d6369..eb20dc8c680f9 100644
--- a/modules/rank-eval/src/main/java/org/elasticsearch/index/rankeval/MeanReciprocalRank.java
+++ b/modules/rank-eval/src/main/java/org/elasticsearch/index/rankeval/MeanReciprocalRank.java
@@ -128,7 +128,7 @@ public EvalQueryQuality evaluate(String taskId, SearchHit[] hits,
double reciprocalRank = (firstRelevant == -1) ? 0 : 1.0d / firstRelevant;
EvalQueryQuality evalQueryQuality = new EvalQueryQuality(taskId, reciprocalRank);
- evalQueryQuality.setMetricDetails(new Breakdown(firstRelevant));
+ evalQueryQuality.setMetricDetails(new Detail(firstRelevant));
evalQueryQuality.addHitsAndRatings(ratedHits);
return evalQueryQuality;
}
@@ -181,16 +181,16 @@ public final int hashCode() {
return Objects.hash(relevantRatingThreshhold, k);
}
- static class Breakdown implements MetricDetail {
+ public static final class Detail implements MetricDetail {
private final int firstRelevantRank;
private static ParseField FIRST_RELEVANT_RANK_FIELD = new ParseField("first_relevant");
- Breakdown(int firstRelevantRank) {
+ Detail(int firstRelevantRank) {
this.firstRelevantRank = firstRelevantRank;
}
- Breakdown(StreamInput in) throws IOException {
+ Detail(StreamInput in) throws IOException {
this.firstRelevantRank = in.readVInt();
}
@@ -206,15 +206,15 @@ public XContentBuilder innerToXContent(XContentBuilder builder, Params params)
return builder.field(FIRST_RELEVANT_RANK_FIELD.getPreferredName(), firstRelevantRank);
}
- private static final ConstructingObjectParser<Breakdown, Void> PARSER = new ConstructingObjectParser<>(NAME, true, args -> {
- return new Breakdown((Integer) args[0]);
+ private static final ConstructingObjectParser<Detail, Void> PARSER = new ConstructingObjectParser<>(NAME, true, args -> {
+ return new Detail((Integer) args[0]);
});
static {
PARSER.declareInt(constructorArg(), FIRST_RELEVANT_RANK_FIELD);
}
- public static Breakdown fromXContent(XContentParser parser) {
+ public static Detail fromXContent(XContentParser parser) {
return PARSER.apply(parser, null);
}
@@ -232,24 +232,24 @@ public String getWriteableName() {
* the ranking of the first relevant document, or -1 if no relevant document was
* found
*/
- int getFirstRelevantRank() {
+ public int getFirstRelevantRank() {
return firstRelevantRank;
}
@Override
- public final boolean equals(Object obj) {
+ public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || getClass() != obj.getClass()) {
return false;
}
- MeanReciprocalRank.Breakdown other = (MeanReciprocalRank.Breakdown) obj;
+ MeanReciprocalRank.Detail other = (MeanReciprocalRank.Detail) obj;
return Objects.equals(firstRelevantRank, other.firstRelevantRank);
}
@Override
- public final int hashCode() {
+ public int hashCode() {
return Objects.hash(firstRelevantRank);
}
}
diff --git a/modules/rank-eval/src/main/java/org/elasticsearch/index/rankeval/PrecisionAtK.java b/modules/rank-eval/src/main/java/org/elasticsearch/index/rankeval/PrecisionAtK.java
index 15d955935eeff..136158ea5cba7 100644
--- a/modules/rank-eval/src/main/java/org/elasticsearch/index/rankeval/PrecisionAtK.java
+++ b/modules/rank-eval/src/main/java/org/elasticsearch/index/rankeval/PrecisionAtK.java
@@ -181,7 +181,7 @@ public EvalQueryQuality evaluate(String taskId, SearchHit[] hits,
}
EvalQueryQuality evalQueryQuality = new EvalQueryQuality(taskId, precision);
evalQueryQuality.setMetricDetails(
- new PrecisionAtK.Breakdown(truePositives, truePositives + falsePositives));
+ new PrecisionAtK.Detail(truePositives, truePositives + falsePositives));
evalQueryQuality.addHitsAndRatings(ratedSearchHits);
return evalQueryQuality;
}
@@ -217,19 +217,19 @@ public final int hashCode() {
return Objects.hash(relevantRatingThreshhold, ignoreUnlabeled, k);
}
- static class Breakdown implements MetricDetail {
+ public static final class Detail implements MetricDetail {
private static final ParseField DOCS_RETRIEVED_FIELD = new ParseField("docs_retrieved");
private static final ParseField RELEVANT_DOCS_RETRIEVED_FIELD = new ParseField("relevant_docs_retrieved");
private int relevantRetrieved;
private int retrieved;
- Breakdown(int relevantRetrieved, int retrieved) {
+ Detail(int relevantRetrieved, int retrieved) {
this.relevantRetrieved = relevantRetrieved;
this.retrieved = retrieved;
}
- Breakdown(StreamInput in) throws IOException {
+ Detail(StreamInput in) throws IOException {
this.relevantRetrieved = in.readVInt();
this.retrieved = in.readVInt();
}
@@ -242,8 +242,8 @@ public XContentBuilder innerToXContent(XContentBuilder builder, Params params)
return builder;
}
- private static final ConstructingObjectParser<Breakdown, Void> PARSER = new ConstructingObjectParser<>(NAME, true, args -> {
- return new Breakdown((Integer) args[0], (Integer) args[1]);
+ private static final ConstructingObjectParser<Detail, Void> PARSER = new ConstructingObjectParser<>(NAME, true, args -> {
+ return new Detail((Integer) args[0], (Integer) args[1]);
});
static {
@@ -251,7 +251,7 @@ public XContentBuilder innerToXContent(XContentBuilder builder, Params params)
PARSER.declareInt(constructorArg(), DOCS_RETRIEVED_FIELD);
}
- public static Breakdown fromXContent(XContentParser parser) {
+ public static Detail fromXContent(XContentParser parser) {
return PARSER.apply(parser, null);
}
@@ -266,29 +266,29 @@ public String getWriteableName() {
return NAME;
}
- int getRelevantRetrieved() {
+ public int getRelevantRetrieved() {
return relevantRetrieved;
}
- int getRetrieved() {
+ public int getRetrieved() {
return retrieved;
}
@Override
- public final boolean equals(Object obj) {
+ public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || getClass() != obj.getClass()) {
return false;
}
- PrecisionAtK.Breakdown other = (PrecisionAtK.Breakdown) obj;
+ PrecisionAtK.Detail other = (PrecisionAtK.Detail) obj;
return Objects.equals(relevantRetrieved, other.relevantRetrieved)
&& Objects.equals(retrieved, other.retrieved);
}
@Override
- public final int hashCode() {
+ public int hashCode() {
return Objects.hash(relevantRetrieved, retrieved);
}
}
diff --git a/modules/rank-eval/src/main/java/org/elasticsearch/index/rankeval/RankEvalNamedXContentProvider.java b/modules/rank-eval/src/main/java/org/elasticsearch/index/rankeval/RankEvalNamedXContentProvider.java
index 54d68774a016e..c5785ca3847d4 100644
--- a/modules/rank-eval/src/main/java/org/elasticsearch/index/rankeval/RankEvalNamedXContentProvider.java
+++ b/modules/rank-eval/src/main/java/org/elasticsearch/index/rankeval/RankEvalNamedXContentProvider.java
@@ -38,9 +38,9 @@ public List<NamedXContentRegistry.Entry> getNamedXContentParsers() {
namedXContent.add(new NamedXContentRegistry.Entry(EvaluationMetric.class, new ParseField(DiscountedCumulativeGain.NAME),
DiscountedCumulativeGain::fromXContent));
namedXContent.add(new NamedXContentRegistry.Entry(MetricDetail.class, new ParseField(PrecisionAtK.NAME),
- PrecisionAtK.Breakdown::fromXContent));
+ PrecisionAtK.Detail::fromXContent));
namedXContent.add(new NamedXContentRegistry.Entry(MetricDetail.class, new ParseField(MeanReciprocalRank.NAME),
- MeanReciprocalRank.Breakdown::fromXContent));
+ MeanReciprocalRank.Detail::fromXContent));
return namedXContent;
}
}
diff --git a/modules/rank-eval/src/main/java/org/elasticsearch/index/rankeval/RankEvalPlugin.java b/modules/rank-eval/src/main/java/org/elasticsearch/index/rankeval/RankEvalPlugin.java
index d4ccd7c2180fe..884cf3bafdcda 100644
--- a/modules/rank-eval/src/main/java/org/elasticsearch/index/rankeval/RankEvalPlugin.java
+++ b/modules/rank-eval/src/main/java/org/elasticsearch/index/rankeval/RankEvalPlugin.java
@@ -60,9 +60,9 @@ public List<NamedWriteableRegistry.Entry> getNamedWriteables() {
namedWriteables.add(new NamedWriteableRegistry.Entry(EvaluationMetric.class, MeanReciprocalRank.NAME, MeanReciprocalRank::new));
namedWriteables.add(
new NamedWriteableRegistry.Entry(EvaluationMetric.class, DiscountedCumulativeGain.NAME, DiscountedCumulativeGain::new));
- namedWriteables.add(new NamedWriteableRegistry.Entry(MetricDetail.class, PrecisionAtK.NAME, PrecisionAtK.Breakdown::new));
+ namedWriteables.add(new NamedWriteableRegistry.Entry(MetricDetail.class, PrecisionAtK.NAME, PrecisionAtK.Detail::new));
namedWriteables
- .add(new NamedWriteableRegistry.Entry(MetricDetail.class, MeanReciprocalRank.NAME, MeanReciprocalRank.Breakdown::new));
+ .add(new NamedWriteableRegistry.Entry(MetricDetail.class, MeanReciprocalRank.NAME, MeanReciprocalRank.Detail::new));
return namedWriteables;
}
diff --git a/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/DiscountedCumulativeGainTests.java b/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/DiscountedCumulativeGainTests.java
index 22c3542c0fab4..ba03a734ec760 100644
--- a/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/DiscountedCumulativeGainTests.java
+++ b/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/DiscountedCumulativeGainTests.java
@@ -253,7 +253,7 @@ private void assertParsedCorrect(String xContent, Integer expectedUnknownDocRati
public static DiscountedCumulativeGain createTestItem() {
boolean normalize = randomBoolean();
- Integer unknownDocRating = new Integer(randomIntBetween(0, 1000));
+ Integer unknownDocRating = Integer.valueOf(randomIntBetween(0, 1000));
return new DiscountedCumulativeGain(normalize, unknownDocRating, 10);
}
diff --git a/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/EvalQueryQualityTests.java b/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/EvalQueryQualityTests.java
index df6de75ba2cb4..112cf4eaaf72e 100644
--- a/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/EvalQueryQualityTests.java
+++ b/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/EvalQueryQualityTests.java
@@ -69,9 +69,9 @@ public static EvalQueryQuality randomEvalQueryQuality() {
randomDoubleBetween(0.0, 1.0, true));
if (randomBoolean()) {
if (randomBoolean()) {
- evalQueryQuality.setMetricDetails(new PrecisionAtK.Breakdown(randomIntBetween(0, 1000), randomIntBetween(0, 1000)));
+ evalQueryQuality.setMetricDetails(new PrecisionAtK.Detail(randomIntBetween(0, 1000), randomIntBetween(0, 1000)));
} else {
- evalQueryQuality.setMetricDetails(new MeanReciprocalRank.Breakdown(randomIntBetween(0, 1000)));
+ evalQueryQuality.setMetricDetails(new MeanReciprocalRank.Detail(randomIntBetween(0, 1000)));
}
}
evalQueryQuality.addHitsAndRatings(ratedHits);
@@ -137,7 +137,7 @@ private static EvalQueryQuality mutateTestItem(EvalQueryQuality original) {
break;
case 2:
if (metricDetails == null) {
- metricDetails = new PrecisionAtK.Breakdown(1, 5);
+ metricDetails = new PrecisionAtK.Detail(1, 5);
} else {
metricDetails = null;
}
diff --git a/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/MeanReciprocalRankTests.java b/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/MeanReciprocalRankTests.java
index 6604dbc74a065..c9ff39bbd118a 100644
--- a/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/MeanReciprocalRankTests.java
+++ b/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/MeanReciprocalRankTests.java
@@ -46,6 +46,9 @@
public class MeanReciprocalRankTests extends ESTestCase {
+ private static final int IRRELEVANT_RATING_0 = 0;
+ private static final int RELEVANT_RATING_1 = 1;
+
public void testParseFromXContent() throws IOException {
String xContent = "{ }";
try (XContentParser parser = createParser(JsonXContent.jsonXContent, xContent)) {
@@ -84,16 +87,16 @@ public void testMaxAcceptableRank() {
int relevantAt = randomIntBetween(0, searchHits);
for (int i = 0; i <= searchHits; i++) {
if (i == relevantAt) {
- ratedDocs.add(new RatedDocument("test", Integer.toString(i), TestRatingEnum.RELEVANT.ordinal()));
+ ratedDocs.add(new RatedDocument("test", Integer.toString(i), RELEVANT_RATING_1));
} else {
- ratedDocs.add(new RatedDocument("test", Integer.toString(i), TestRatingEnum.IRRELEVANT.ordinal()));
+ ratedDocs.add(new RatedDocument("test", Integer.toString(i), IRRELEVANT_RATING_0));
}
}
int rankAtFirstRelevant = relevantAt + 1;
EvalQueryQuality evaluation = reciprocalRank.evaluate("id", hits, ratedDocs);
assertEquals(1.0 / rankAtFirstRelevant, evaluation.getQualityLevel(), Double.MIN_VALUE);
- assertEquals(rankAtFirstRelevant, ((MeanReciprocalRank.Breakdown) evaluation.getMetricDetails()).getFirstRelevantRank());
+ assertEquals(rankAtFirstRelevant, ((MeanReciprocalRank.Detail) evaluation.getMetricDetails()).getFirstRelevantRank());
// check that if we have fewer search hits than relevant doc position,
// we don't find any result and get 0.0 quality level
@@ -110,15 +113,15 @@ public void testEvaluationOneRelevantInResults() {
int relevantAt = randomIntBetween(0, 9);
for (int i = 0; i <= 20; i++) {
if (i == relevantAt) {
- ratedDocs.add(new RatedDocument("test", Integer.toString(i), TestRatingEnum.RELEVANT.ordinal()));
+ ratedDocs.add(new RatedDocument("test", Integer.toString(i), RELEVANT_RATING_1));
} else {
- ratedDocs.add(new RatedDocument("test", Integer.toString(i), TestRatingEnum.IRRELEVANT.ordinal()));
+ ratedDocs.add(new RatedDocument("test", Integer.toString(i), IRRELEVANT_RATING_0));
}
}
EvalQueryQuality evaluation = reciprocalRank.evaluate("id", hits, ratedDocs);
assertEquals(1.0 / (relevantAt + 1), evaluation.getQualityLevel(), Double.MIN_VALUE);
- assertEquals(relevantAt + 1, ((MeanReciprocalRank.Breakdown) evaluation.getMetricDetails()).getFirstRelevantRank());
+ assertEquals(relevantAt + 1, ((MeanReciprocalRank.Detail) evaluation.getMetricDetails()).getFirstRelevantRank());
}
/**
@@ -138,7 +141,7 @@ public void testPrecisionAtFiveRelevanceThreshold() {
MeanReciprocalRank reciprocalRank = new MeanReciprocalRank(2, 10);
EvalQueryQuality evaluation = reciprocalRank.evaluate("id", hits, rated);
assertEquals((double) 1 / 3, evaluation.getQualityLevel(), 0.00001);
- assertEquals(3, ((MeanReciprocalRank.Breakdown) evaluation.getMetricDetails()).getFirstRelevantRank());
+ assertEquals(3, ((MeanReciprocalRank.Detail) evaluation.getMetricDetails()).getFirstRelevantRank());
}
public void testCombine() {
@@ -162,7 +165,7 @@ public void testNoResults() throws Exception {
SearchHit[] hits = new SearchHit[0];
EvalQueryQuality evaluated = (new MeanReciprocalRank()).evaluate("id", hits, Collections.emptyList());
assertEquals(0.0d, evaluated.getQualityLevel(), 0.00001);
- assertEquals(-1, ((MeanReciprocalRank.Breakdown) evaluated.getMetricDetails()).getFirstRelevantRank());
+ assertEquals(-1, ((MeanReciprocalRank.Detail) evaluated.getMetricDetails()).getFirstRelevantRank());
}
public void testXContentRoundtrip() throws IOException {
diff --git a/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/PrecisionAtKTests.java b/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/PrecisionAtKTests.java
index aa3dd5a0b7e32..3efff57920b84 100644
--- a/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/PrecisionAtKTests.java
+++ b/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/PrecisionAtKTests.java
@@ -46,26 +46,29 @@
public class PrecisionAtKTests extends ESTestCase {
+ private static final int IRRELEVANT_RATING_0 = 0;
+ private static final int RELEVANT_RATING_1 = 1;
+
public void testPrecisionAtFiveCalculation() {
List<RatedDocument> rated = new ArrayList<>();
- rated.add(createRatedDoc("test", "0", TestRatingEnum.RELEVANT.ordinal()));
+ rated.add(createRatedDoc("test", "0", RELEVANT_RATING_1));
EvalQueryQuality evaluated = (new PrecisionAtK()).evaluate("id", toSearchHits(rated, "test"), rated);
assertEquals(1, evaluated.getQualityLevel(), 0.00001);
- assertEquals(1, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRelevantRetrieved());
- assertEquals(1, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRetrieved());
+ assertEquals(1, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
+ assertEquals(1, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
}
public void testPrecisionAtFiveIgnoreOneResult() {
List<RatedDocument> rated = new ArrayList<>();
- rated.add(createRatedDoc("test", "0", TestRatingEnum.RELEVANT.ordinal()));
- rated.add(createRatedDoc("test", "1", TestRatingEnum.RELEVANT.ordinal()));
- rated.add(createRatedDoc("test", "2", TestRatingEnum.RELEVANT.ordinal()));
- rated.add(createRatedDoc("test", "3", TestRatingEnum.RELEVANT.ordinal()));
- rated.add(createRatedDoc("test", "4", TestRatingEnum.IRRELEVANT.ordinal()));
+ rated.add(createRatedDoc("test", "0", RELEVANT_RATING_1));
+ rated.add(createRatedDoc("test", "1", RELEVANT_RATING_1));
+ rated.add(createRatedDoc("test", "2", RELEVANT_RATING_1));
+ rated.add(createRatedDoc("test", "3", RELEVANT_RATING_1));
+ rated.add(createRatedDoc("test", "4", IRRELEVANT_RATING_0));
EvalQueryQuality evaluated = (new PrecisionAtK()).evaluate("id", toSearchHits(rated, "test"), rated);
assertEquals((double) 4 / 5, evaluated.getQualityLevel(), 0.00001);
- assertEquals(4, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRelevantRetrieved());
- assertEquals(5, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRetrieved());
+ assertEquals(4, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
+ assertEquals(5, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
}
/**
@@ -83,28 +86,28 @@ public void testPrecisionAtFiveRelevanceThreshold() {
PrecisionAtK precisionAtN = new PrecisionAtK(2, false, 5);
EvalQueryQuality evaluated = precisionAtN.evaluate("id", toSearchHits(rated, "test"), rated);
assertEquals((double) 3 / 5, evaluated.getQualityLevel(), 0.00001);
- assertEquals(3, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRelevantRetrieved());
- assertEquals(5, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRetrieved());
+ assertEquals(3, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
+ assertEquals(5, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
}
public void testPrecisionAtFiveCorrectIndex() {
List<RatedDocument> rated = new ArrayList<>();
- rated.add(createRatedDoc("test_other", "0", TestRatingEnum.RELEVANT.ordinal()));
- rated.add(createRatedDoc("test_other", "1", TestRatingEnum.RELEVANT.ordinal()));
- rated.add(createRatedDoc("test", "0", TestRatingEnum.RELEVANT.ordinal()));
- rated.add(createRatedDoc("test", "1", TestRatingEnum.RELEVANT.ordinal()));
- rated.add(createRatedDoc("test", "2", TestRatingEnum.IRRELEVANT.ordinal()));
+ rated.add(createRatedDoc("test_other", "0", RELEVANT_RATING_1));
+ rated.add(createRatedDoc("test_other", "1", RELEVANT_RATING_1));
+ rated.add(createRatedDoc("test", "0", RELEVANT_RATING_1));
+ rated.add(createRatedDoc("test", "1", RELEVANT_RATING_1));
+ rated.add(createRatedDoc("test", "2", IRRELEVANT_RATING_0));
// the following search hits contain only the last three documents
EvalQueryQuality evaluated = (new PrecisionAtK()).evaluate("id", toSearchHits(rated.subList(2, 5), "test"), rated);
assertEquals((double) 2 / 3, evaluated.getQualityLevel(), 0.00001);
- assertEquals(2, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRelevantRetrieved());
- assertEquals(3, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRetrieved());
+ assertEquals(2, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
+ assertEquals(3, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
}
public void testIgnoreUnlabeled() {
List<RatedDocument> rated = new ArrayList<>();
- rated.add(createRatedDoc("test", "0", TestRatingEnum.RELEVANT.ordinal()));
- rated.add(createRatedDoc("test", "1", TestRatingEnum.RELEVANT.ordinal()));
+ rated.add(createRatedDoc("test", "0", RELEVANT_RATING_1));
+ rated.add(createRatedDoc("test", "1", RELEVANT_RATING_1));
// add an unlabeled search hit
SearchHit[] searchHits = Arrays.copyOf(toSearchHits(rated, "test"), 3);
searchHits[2] = new SearchHit(2, "2", new Text("testtype"), Collections.emptyMap());
@@ -112,15 +115,15 @@ public void testIgnoreUnlabeled() {
EvalQueryQuality evaluated = (new PrecisionAtK()).evaluate("id", searchHits, rated);
assertEquals((double) 2 / 3, evaluated.getQualityLevel(), 0.00001);
- assertEquals(2, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRelevantRetrieved());
- assertEquals(3, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRetrieved());
+ assertEquals(2, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
+ assertEquals(3, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
// also try with setting `ignore_unlabeled`
PrecisionAtK prec = new PrecisionAtK(1, true, 10);
evaluated = prec.evaluate("id", searchHits, rated);
assertEquals((double) 2 / 2, evaluated.getQualityLevel(), 0.00001);
- assertEquals(2, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRelevantRetrieved());
- assertEquals(2, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRetrieved());
+ assertEquals(2, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
+ assertEquals(2, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
}
public void testNoRatedDocs() throws Exception {
@@ -131,23 +134,23 @@ public void testNoRatedDocs() throws Exception {
}
EvalQueryQuality evaluated = (new PrecisionAtK()).evaluate("id", hits, Collections.emptyList());
assertEquals(0.0d, evaluated.getQualityLevel(), 0.00001);
- assertEquals(0, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRelevantRetrieved());
- assertEquals(5, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRetrieved());
+ assertEquals(0, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
+ assertEquals(5, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
// also try with setting `ignore_unlabeled`
PrecisionAtK prec = new PrecisionAtK(1, true, 10);
evaluated = prec.evaluate("id", hits, Collections.emptyList());
assertEquals(0.0d, evaluated.getQualityLevel(), 0.00001);
- assertEquals(0, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRelevantRetrieved());
- assertEquals(0, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRetrieved());
+ assertEquals(0, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
+ assertEquals(0, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
}
public void testNoResults() throws Exception {
SearchHit[] hits = new SearchHit[0];
EvalQueryQuality evaluated = (new PrecisionAtK()).evaluate("id", hits, Collections.emptyList());
assertEquals(0.0d, evaluated.getQualityLevel(), 0.00001);
- assertEquals(0, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRelevantRetrieved());
- assertEquals(0, ((PrecisionAtK.Breakdown) evaluated.getMetricDetails()).getRetrieved());
+ assertEquals(0, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRelevantRetrieved());
+ assertEquals(0, ((PrecisionAtK.Detail) evaluated.getMetricDetails()).getRetrieved());
}
public void testParseFromXContent() throws IOException {
diff --git a/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/RankEvalRequestIT.java b/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/RankEvalRequestIT.java
index dc0bbddeb62b1..b55c57bae2bcf 100644
--- a/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/RankEvalRequestIT.java
+++ b/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/RankEvalRequestIT.java
@@ -20,12 +20,13 @@
package org.elasticsearch.index.rankeval;
import org.elasticsearch.ElasticsearchException;
+import org.elasticsearch.action.admin.indices.alias.IndicesAliasesRequest.AliasActions;
import org.elasticsearch.action.search.SearchRequest;
import org.elasticsearch.action.support.IndicesOptions;
import org.elasticsearch.index.IndexNotFoundException;
import org.elasticsearch.index.query.MatchAllQueryBuilder;
import org.elasticsearch.index.query.QueryBuilders;
-import org.elasticsearch.index.rankeval.PrecisionAtK.Breakdown;
+import org.elasticsearch.index.rankeval.PrecisionAtK.Detail;
import org.elasticsearch.indices.IndexClosedException;
import org.elasticsearch.plugins.Plugin;
import org.elasticsearch.search.builder.SearchSourceBuilder;
@@ -40,9 +41,15 @@
import java.util.Set;
import static org.elasticsearch.index.rankeval.EvaluationMetric.filterUnknownDocuments;
+import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
import static org.hamcrest.Matchers.instanceOf;
public class RankEvalRequestIT extends ESIntegTestCase {
+
+ private static final String TEST_INDEX = "test";
+ private static final String INDEX_ALIAS = "alias0";
+ private static final int RELEVANT_RATING_1 = 1;
+
@Override
protected Collection<Class<? extends Plugin>> transportClientPlugins() {
return Arrays.asList(RankEvalPlugin.class);
@@ -55,20 +62,23 @@ protected Collection<Class<? extends Plugin>> nodePlugins() {
@Before
public void setup() {
- createIndex("test");
+ createIndex(TEST_INDEX);
ensureGreen();
- client().prepareIndex("test", "testtype").setId("1")
+ client().prepareIndex(TEST_INDEX, "testtype").setId("1")
.setSource("text", "berlin", "title", "Berlin, Germany", "population", 3670622).get();
- client().prepareIndex("test", "testtype").setId("2").setSource("text", "amsterdam", "population", 851573).get();
- client().prepareIndex("test", "testtype").setId("3").setSource("text", "amsterdam", "population", 851573).get();
- client().prepareIndex("test", "testtype").setId("4").setSource("text", "amsterdam", "population", 851573).get();
- client().prepareIndex("test", "testtype").setId("5").setSource("text", "amsterdam", "population", 851573).get();
- client().prepareIndex("test", "testtype").setId("6").setSource("text", "amsterdam", "population", 851573).get();
+ client().prepareIndex(TEST_INDEX, "testtype").setId("2").setSource("text", "amsterdam", "population", 851573).get();
+ client().prepareIndex(TEST_INDEX, "testtype").setId("3").setSource("text", "amsterdam", "population", 851573).get();
+ client().prepareIndex(TEST_INDEX, "testtype").setId("4").setSource("text", "amsterdam", "population", 851573).get();
+ client().prepareIndex(TEST_INDEX, "testtype").setId("5").setSource("text", "amsterdam", "population", 851573).get();
+ client().prepareIndex(TEST_INDEX, "testtype").setId("6").setSource("text", "amsterdam", "population", 851573).get();
// add another index for testing closed indices etc...
client().prepareIndex("test2", "testtype").setId("7").setSource("text", "amsterdam", "population", 851573).get();
refresh();
+
+ // set up an alias that can also be used in tests
+ assertAcked(client().admin().indices().prepareAliases().addAliasAction(AliasActions.add().index(TEST_INDEX).alias(INDEX_ALIAS)));
}
/**
@@ -98,7 +108,8 @@ public void testPrecisionAtRequest() {
RankEvalAction.INSTANCE, new RankEvalRequest());
builder.setRankEvalSpec(task);
- RankEvalResponse response = client().execute(RankEvalAction.INSTANCE, builder.request().indices("test"))
+ String indexToUse = randomBoolean() ? TEST_INDEX : INDEX_ALIAS;
+ RankEvalResponse response = client().execute(RankEvalAction.INSTANCE, builder.request().indices(indexToUse))
.actionGet();
// the expected Prec@ for the first query is 4/6 and the expected Prec@ for the
// second is 1/6, divided by 2 to get the average
@@ -117,7 +128,7 @@ public void testPrecisionAtRequest() {
if (id.equals("1") || id.equals("6")) {
assertFalse(hit.getRating().isPresent());
} else {
- assertEquals(TestRatingEnum.RELEVANT.ordinal(), hit.getRating().get().intValue());
+ assertEquals(RELEVANT_RATING_1, hit.getRating().get().intValue());
}
}
}
@@ -128,7 +139,7 @@ public void testPrecisionAtRequest() {
for (RatedSearchHit hit : hitsAndRatings) {
String id = hit.getSearchHit().getId();
if (id.equals("1")) {
- assertEquals(TestRatingEnum.RELEVANT.ordinal(), hit.getRating().get().intValue());
+ assertEquals(RELEVANT_RATING_1, hit.getRating().get().intValue());
} else {
assertFalse(hit.getRating().isPresent());
}
@@ -140,7 +151,7 @@ public void testPrecisionAtRequest() {
metric = new PrecisionAtK(1, false, 3);
task = new RankEvalSpec(specifications, metric);
- builder = new RankEvalRequestBuilder(client(), RankEvalAction.INSTANCE, new RankEvalRequest(task, new String[] { "test" }));
+ builder = new RankEvalRequestBuilder(client(), RankEvalAction.INSTANCE, new RankEvalRequest(task, new String[] { TEST_INDEX }));
response = client().execute(RankEvalAction.INSTANCE, builder.request()).actionGet();
// if we look only at top 3 documents, the expected P@3 for the first query is
@@ -160,19 +171,19 @@ public void testDCGRequest() {
List<RatedRequest> specifications = new ArrayList<>();
List<RatedDocument> ratedDocs = Arrays.asList(
- new RatedDocument("test", "1", 3),
- new RatedDocument("test", "2", 2),
- new RatedDocument("test", "3", 3),
- new RatedDocument("test", "4", 0),
- new RatedDocument("test", "5", 1),
- new RatedDocument("test", "6", 2));
+ new RatedDocument(TEST_INDEX, "1", 3),
+ new RatedDocument(TEST_INDEX, "2", 2),
+ new RatedDocument(TEST_INDEX, "3", 3),
+ new RatedDocument(TEST_INDEX, "4", 0),
+ new RatedDocument(TEST_INDEX, "5", 1),
+ new RatedDocument(TEST_INDEX, "6", 2));
specifications.add(new RatedRequest("amsterdam_query", ratedDocs, testQuery));
DiscountedCumulativeGain metric = new DiscountedCumulativeGain(false, null, 10);
RankEvalSpec task = new RankEvalSpec(specifications, metric);
RankEvalRequestBuilder builder = new RankEvalRequestBuilder(client(), RankEvalAction.INSTANCE,
- new RankEvalRequest(task, new String[] { "test" }));
+ new RankEvalRequest(task, new String[] { TEST_INDEX }));
RankEvalResponse response = client().execute(RankEvalAction.INSTANCE, builder.request()).actionGet();
assertEquals(DiscountedCumulativeGainTests.EXPECTED_DCG, response.getEvaluationResult(), 10E-14);
@@ -181,7 +192,7 @@ public void testDCGRequest() {
metric = new DiscountedCumulativeGain(false, null, 3);
task = new RankEvalSpec(specifications, metric);
- builder = new RankEvalRequestBuilder(client(), RankEvalAction.INSTANCE, new RankEvalRequest(task, new String[] { "test" }));
+ builder = new RankEvalRequestBuilder(client(), RankEvalAction.INSTANCE, new RankEvalRequest(task, new String[] { TEST_INDEX }));
response = client().execute(RankEvalAction.INSTANCE, builder.request()).actionGet();
assertEquals(12.39278926071437, response.getEvaluationResult(), 10E-14);
@@ -200,7 +211,7 @@ public void testMRRRequest() {
RankEvalSpec task = new RankEvalSpec(specifications, metric);
RankEvalRequestBuilder builder = new RankEvalRequestBuilder(client(), RankEvalAction.INSTANCE,
- new RankEvalRequest(task, new String[] { "test" }));
+ new RankEvalRequest(task, new String[] { TEST_INDEX }));
RankEvalResponse response = client().execute(RankEvalAction.INSTANCE, builder.request()).actionGet();
// the expected reciprocal rank for the amsterdam_query is 1/5
@@ -213,7 +224,7 @@ public void testMRRRequest() {
metric = new MeanReciprocalRank(1, 3);
task = new RankEvalSpec(specifications, metric);
- builder = new RankEvalRequestBuilder(client(), RankEvalAction.INSTANCE, new RankEvalRequest(task, new String[] { "test" }));
+ builder = new RankEvalRequestBuilder(client(), RankEvalAction.INSTANCE, new RankEvalRequest(task, new String[] { TEST_INDEX }));
response = client().execute(RankEvalAction.INSTANCE, builder.request()).actionGet();
// limiting to top 3 results, the amsterdam_query has no relevant document in it
@@ -244,7 +255,7 @@ public void testBadQuery() {
RankEvalSpec task = new RankEvalSpec(specifications, new PrecisionAtK());
RankEvalRequestBuilder builder = new RankEvalRequestBuilder(client(), RankEvalAction.INSTANCE,
- new RankEvalRequest(task, new String[] { "test" }));
+ new RankEvalRequest(task, new String[] { TEST_INDEX }));
builder.setRankEvalSpec(task);
RankEvalResponse response = client().execute(RankEvalAction.INSTANCE, builder.request()).actionGet();
@@ -259,16 +270,16 @@ public void testBadQuery() {
public void testIndicesOptions() {
SearchSourceBuilder amsterdamQuery = new SearchSourceBuilder().query(new MatchAllQueryBuilder());
List<RatedDocument> relevantDocs = createRelevant("2", "3", "4", "5", "6");
- relevantDocs.add(new RatedDocument("test2", "7", TestRatingEnum.RELEVANT.ordinal()));
+ relevantDocs.add(new RatedDocument("test2", "7", RELEVANT_RATING_1));
List<RatedRequest> specifications = new ArrayList<>();
specifications.add(new RatedRequest("amsterdam_query", relevantDocs, amsterdamQuery));
RankEvalSpec task = new RankEvalSpec(specifications, new PrecisionAtK());
- RankEvalRequest request = new RankEvalRequest(task, new String[] { "test", "test2" });
+ RankEvalRequest request = new RankEvalRequest(task, new String[] { TEST_INDEX, "test2" });
request.setRankEvalSpec(task);
RankEvalResponse response = client().execute(RankEvalAction.INSTANCE, request).actionGet();
- Breakdown details = (PrecisionAtK.Breakdown) response.getPartialResults().get("amsterdam_query").getMetricDetails();
+ Detail details = (PrecisionAtK.Detail) response.getPartialResults().get("amsterdam_query").getMetricDetails();
assertEquals(7, details.getRetrieved());
assertEquals(6, details.getRelevantRetrieved());
@@ -277,7 +288,7 @@ public void testIndicesOptions() {
request.indicesOptions(IndicesOptions.fromParameters(null, "true", null, SearchRequest.DEFAULT_INDICES_OPTIONS));
response = client().execute(RankEvalAction.INSTANCE, request).actionGet();
- details = (PrecisionAtK.Breakdown) response.getPartialResults().get("amsterdam_query").getMetricDetails();
+ details = (PrecisionAtK.Detail) response.getPartialResults().get("amsterdam_query").getMetricDetails();
assertEquals(6, details.getRetrieved());
assertEquals(5, details.getRelevantRetrieved());
@@ -292,12 +303,12 @@ public void testIndicesOptions() {
request = new RankEvalRequest(task, new String[] { "tes*" });
request.indicesOptions(IndicesOptions.fromParameters("none", null, null, SearchRequest.DEFAULT_INDICES_OPTIONS));
response = client().execute(RankEvalAction.INSTANCE, request).actionGet();
- details = (PrecisionAtK.Breakdown) response.getPartialResults().get("amsterdam_query").getMetricDetails();
+ details = (PrecisionAtK.Detail) response.getPartialResults().get("amsterdam_query").getMetricDetails();
assertEquals(0, details.getRetrieved());
request.indicesOptions(IndicesOptions.fromParameters("open", null, null, SearchRequest.DEFAULT_INDICES_OPTIONS));
response = client().execute(RankEvalAction.INSTANCE, request).actionGet();
- details = (PrecisionAtK.Breakdown) response.getPartialResults().get("amsterdam_query").getMetricDetails();
+ details = (PrecisionAtK.Detail) response.getPartialResults().get("amsterdam_query").getMetricDetails();
assertEquals(6, details.getRetrieved());
assertEquals(5, details.getRelevantRetrieved());
@@ -310,7 +321,7 @@ public void testIndicesOptions() {
request = new RankEvalRequest(task, new String[] { "bad*" });
request.indicesOptions(IndicesOptions.fromParameters(null, null, "true", SearchRequest.DEFAULT_INDICES_OPTIONS));
response = client().execute(RankEvalAction.INSTANCE, request).actionGet();
- details = (PrecisionAtK.Breakdown) response.getPartialResults().get("amsterdam_query").getMetricDetails();
+ details = (PrecisionAtK.Detail) response.getPartialResults().get("amsterdam_query").getMetricDetails();
assertEquals(0, details.getRetrieved());
request.indicesOptions(IndicesOptions.fromParameters(null, null, "false", SearchRequest.DEFAULT_INDICES_OPTIONS));
@@ -322,7 +333,7 @@ public void testIndicesOptions() {
private static List<RatedDocument> createRelevant(String... docs) {
List<RatedDocument> relevant = new ArrayList<>();
for (String doc : docs) {
- relevant.add(new RatedDocument("test", doc, TestRatingEnum.RELEVANT.ordinal()));
+ relevant.add(new RatedDocument("test", doc, RELEVANT_RATING_1));
}
return relevant;
}
diff --git a/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/RankEvalSpecTests.java b/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/RankEvalSpecTests.java
index b49811a9bcaec..e0899b451af11 100644
--- a/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/RankEvalSpecTests.java
+++ b/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/RankEvalSpecTests.java
@@ -52,7 +52,6 @@
import static org.elasticsearch.test.EqualsHashCodeTestUtils.checkEqualsAndHashCode;
import static org.elasticsearch.test.XContentTestUtils.insertRandomFields;
import static org.hamcrest.Matchers.containsString;
-import static org.hamcrest.Matchers.startsWith;
public class RankEvalSpecTests extends ESTestCase {
diff --git a/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/RatedRequestsTests.java b/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/RatedRequestsTests.java
index ad962178f581f..196b50b7f6163 100644
--- a/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/RatedRequestsTests.java
+++ b/modules/rank-eval/src/test/java/org/elasticsearch/index/rankeval/RatedRequestsTests.java
@@ -19,8 +19,6 @@
package org.elasticsearch.index.rankeval;
-import org.elasticsearch.ExceptionsHelper;
-import org.elasticsearch.common.ParsingException;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.io.stream.NamedWriteableRegistry;
import org.elasticsearch.common.settings.Settings;
@@ -54,7 +52,6 @@
import static org.elasticsearch.test.EqualsHashCodeTestUtils.checkEqualsAndHashCode;
import static org.elasticsearch.test.XContentTestUtils.insertRandomFields;
import static org.hamcrest.Matchers.containsString;
-import static org.hamcrest.Matchers.startsWith;
public class RatedRequestsTests extends ESTestCase {
@@ -139,8 +136,8 @@ public void testXContentParsingIsNotLenient() throws IOException {
Exception exception = expectThrows(Exception.class, () -> RatedRequest.fromXContent(parser));
if (exception instanceof XContentParseException) {
XContentParseException xcpe = (XContentParseException) exception;
- assertThat(ExceptionsHelper.detailedMessage(xcpe), containsString("unknown field"));
- assertThat(ExceptionsHelper.detailedMessage(xcpe), containsString("parser not found"));
+ assertThat(xcpe.getCause().getMessage(), containsString("unknown field"));
+ assertThat(xcpe.getCause().getMessage(), containsString("parser not found"));
}
if (exception instanceof XContentParseException) {
assertThat(exception.getMessage(), containsString("[request] failed to parse field"));
diff --git a/modules/rank-eval/src/test/resources/rest-api-spec/test/rank_eval/10_basic.yml b/modules/rank-eval/src/test/resources/rest-api-spec/test/rank_eval/10_basic.yml
index fcf5f945a06ae..3900b1f32baa7 100644
--- a/modules/rank-eval/src/test/resources/rest-api-spec/test/rank_eval/10_basic.yml
+++ b/modules/rank-eval/src/test/resources/rest-api-spec/test/rank_eval/10_basic.yml
@@ -1,10 +1,4 @@
----
-"Response format":
-
- - skip:
- version: " - 6.2.99"
- reason: response format was updated in 6.3
-
+setup:
- do:
indices.create:
index: foo
@@ -43,8 +37,21 @@
- do:
indices.refresh: {}
+ - do:
+ indices.put_alias:
+ index: foo
+ name: alias
+
+---
+"Response format":
+
+ - skip:
+ version: " - 6.2.99"
+ reason: response format was updated in 6.3
+
- do:
rank_eval:
+ index: foo
body: {
"requests" : [
{
@@ -84,52 +91,43 @@
- match: { details.berlin_query.hits.0.hit._id: "doc1" }
- match: { details.berlin_query.hits.0.rating: 1}
- match: { details.berlin_query.hits.1.hit._id: "doc4" }
- - is_false: details.berlin_query.hits.1.rating
+ - is_false: details.berlin_query.hits.1.rating
---
-"Mean Reciprocal Rank":
-
- - skip:
- version: " - 6.2.99"
- reason: response format was updated in 6.3
+"Alias resolution":
- do:
- indices.create:
- index: foo
- body:
- settings:
- index:
- number_of_shards: 1
- - do:
- index:
- index: foo
- type: bar
- id: doc1
- body: { "text": "berlin" }
+ rank_eval:
+ index: alias
+ body: {
+ "requests" : [
+ {
+ "id": "amsterdam_query",
+ "request": { "query": { "match" : {"text" : "amsterdam" }}},
+ "ratings": [
+ {"_index": "foo", "_id": "doc1", "rating": 0},
+ {"_index": "foo", "_id": "doc2", "rating": 1},
+ {"_index": "foo", "_id": "doc3", "rating": 1}]
+ },
+ {
+ "id" : "berlin_query",
+ "request": { "query": { "match" : { "text" : "berlin" } }, "size" : 10 },
+ "ratings": [{"_index": "foo", "_id": "doc1", "rating": 1}]
+ }
+ ],
+ "metric" : { "precision": { "ignore_unlabeled" : true }}
+ }
- - do:
- index:
- index: foo
- type: bar
- id: doc2
- body: { "text": "amsterdam" }
+ - match: { quality_level: 1}
+ - match: { details.amsterdam_query.quality_level: 1.0}
+ - match: { details.berlin_query.quality_level: 1.0}
- - do:
- index:
- index: foo
- type: bar
- id: doc3
- body: { "text": "amsterdam" }
-
- - do:
- index:
- index: foo
- type: bar
- id: doc4
- body: { "text": "something about amsterdam and berlin" }
+---
+"Mean Reciprocal Rank":
- - do:
- indices.refresh: {}
+ - skip:
+ version: " - 6.2.99"
+ reason: response format was updated in 6.3
- do:
rank_eval:
diff --git a/modules/reindex/build.gradle b/modules/reindex/build.gradle
index 479fe78cc8071..f34f4cf52e09c 100644
--- a/modules/reindex/build.gradle
+++ b/modules/reindex/build.gradle
@@ -17,6 +17,10 @@
* under the License.
*/
+import org.apache.tools.ant.taskdefs.condition.Os
+
+import static org.elasticsearch.gradle.BuildPlugin.getJavaHome
+
apply plugin: 'elasticsearch.test-with-dependencies'
esplugin {
@@ -60,3 +64,61 @@ thirdPartyAudit.excludes = [
'org.apache.log.Hierarchy',
'org.apache.log.Logger',
]
+
+// Support for testing reindex-from-remote against old Elasticsearch versions
+configurations {
+ oldesFixture
+ es2
+ es1
+ es090
+}
+
+dependencies {
+ oldesFixture project(':test:fixtures:old-elasticsearch')
+ /* Right now we just test against the latest version of each major we expect
+ * reindex-from-remote to work against. We could randomize the versions but
+ * that doesn't seem worth it at this point. */
+ es2 'org.elasticsearch.distribution.zip:elasticsearch:2.4.5@zip'
+ es1 'org.elasticsearch:elasticsearch:1.7.6@zip'
+ es090 'org.elasticsearch:elasticsearch:0.90.13@zip'
+}
+
+if (Os.isFamily(Os.FAMILY_WINDOWS)) {
+ // we can't get the pid files in windows so we skip reindex-from-old
+ integTestRunner.systemProperty "tests.fromOld", "false"
+} else {
+ integTestRunner.systemProperty "tests.fromOld", "true"
+ /* Set up tasks to unzip and run the old versions of ES before running the
+ * integration tests. */
+ for (String version : ['2', '1', '090']) {
+ Task unzip = task("unzipEs${version}", type: Sync) {
+ Configuration oldEsDependency = configurations['es' + version]
+ dependsOn oldEsDependency
+ /* Use a closure here to delay resolution of the dependency until we need
+ * it */
+ from {
+ oldEsDependency.collect { zipTree(it) }
+ }
+ into temporaryDir
+ }
+ Task fixture = task("oldEs${version}Fixture",
+ type: org.elasticsearch.gradle.test.AntFixture) {
+ dependsOn project.configurations.oldesFixture
+ dependsOn unzip
+ executable = new File(project.runtimeJavaHome, 'bin/java')
+ env 'CLASSPATH', "${ -> project.configurations.oldesFixture.asPath }"
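+    // the old node itself runs on Java 7 since these ancient versions presumably don't support modern JVMs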
+ env 'JAVA_HOME', getJavaHome(it, 7)
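+    // the trailing boolean below flags the 0.90 distribution, which the launcher presumably has to start differently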
+ args 'oldes.OldElasticsearch',
+ baseDir,
+ unzip.temporaryDir,
+ version == '090'
+ }
+ integTest.dependsOn fixture
+ integTestRunner {
+ /* Use a closure on the string to delay evaluation until right before we
+ * run the integration tests so that we can be sure that the file is
+ * ready. */
+ systemProperty "es${version}.port", "${ -> fixture.addressAndPort }"
+ }
+ }
+}
diff --git a/modules/reindex/src/test/java/org/elasticsearch/index/reindex/RetryTests.java b/modules/reindex/src/test/java/org/elasticsearch/index/reindex/RetryTests.java
index da0dbf2aae345..131c959af8afc 100644
--- a/modules/reindex/src/test/java/org/elasticsearch/index/reindex/RetryTests.java
+++ b/modules/reindex/src/test/java/org/elasticsearch/index/reindex/RetryTests.java
@@ -158,10 +158,10 @@ private void testCase(
final Settings nodeSettings = Settings.builder()
// use pools of size 1 so we can block them
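+            // note: the bulk thread pool has been renamed to write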
- .put("thread_pool.bulk.size", 1)
+ .put("thread_pool.write.size", 1)
.put("thread_pool.search.size", 1)
// use queues of size 1 because size 0 is broken and because search requests need the queue to function
- .put("thread_pool.bulk.queue_size", 1)
+ .put("thread_pool.write.queue_size", 1)
.put("thread_pool.search.queue_size", 1)
.put("node.attr.color", "blue")
.build();
@@ -203,7 +203,7 @@ private void testCase(
assertBusy(() -> assertThat(taskStatus(action).getSearchRetries(), greaterThan(0L)));
logger.info("Blocking bulk and unblocking search so we start to get bulk rejections");
- CyclicBarrier bulkBlock = blockExecutor(ThreadPool.Names.BULK, node);
+ CyclicBarrier bulkBlock = blockExecutor(ThreadPool.Names.WRITE, node);
initialSearchBlock.await();
logger.info("Waiting for bulk rejections");
diff --git a/qa/reindex-from-old/src/test/java/org/elasticsearch/smoketest/ReindexFromOldRemoteIT.java b/modules/reindex/src/test/java/org/elasticsearch/index/reindex/remote/ReindexFromOldRemoteIT.java
similarity index 95%
rename from qa/reindex-from-old/src/test/java/org/elasticsearch/smoketest/ReindexFromOldRemoteIT.java
rename to modules/reindex/src/test/java/org/elasticsearch/index/reindex/remote/ReindexFromOldRemoteIT.java
index 459aff3439710..5d359053a6668 100644
--- a/qa/reindex-from-old/src/test/java/org/elasticsearch/smoketest/ReindexFromOldRemoteIT.java
+++ b/modules/reindex/src/test/java/org/elasticsearch/index/reindex/remote/ReindexFromOldRemoteIT.java
@@ -17,7 +17,7 @@
* under the License.
*/
-package org.elasticsearch.smoketest;
+package org.elasticsearch.index.reindex.remote;
import org.apache.http.HttpEntity;
import org.apache.http.HttpHost;
@@ -27,6 +27,7 @@
import org.elasticsearch.client.Response;
import org.elasticsearch.client.ResponseException;
import org.elasticsearch.client.RestClient;
+import org.elasticsearch.common.Booleans;
import org.elasticsearch.test.rest.ESRestTestCase;
import java.io.IOException;
@@ -38,6 +39,9 @@
public class ReindexFromOldRemoteIT extends ESRestTestCase {
private void oldEsTestCase(String portPropertyName, String requestsPerSecond) throws IOException {
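+        // reindex-from-old is skipped on Windows, where the build can't manage the old nodes' pid files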
+ boolean enabled = Booleans.parseBoolean(System.getProperty("tests.fromOld"));
+ assumeTrue("test is disabled, probably because this is windows", enabled);
+
int oldEsPort = Integer.parseInt(System.getProperty(portPropertyName));
try (RestClient oldEs = RestClient.builder(new HttpHost("127.0.0.1", oldEsPort)).build()) {
try {
diff --git a/plugins/build.gradle b/plugins/build.gradle
index 27655abf534f6..cf942148dea94 100644
--- a/plugins/build.gradle
+++ b/plugins/build.gradle
@@ -27,7 +27,7 @@ configure(subprojects.findAll { it.parent.path == project.path }) {
// for local ES plugins, the name of the plugin is the same as the directory
name project.name
- licenseFile rootProject.file('LICENSE.txt')
+ licenseFile rootProject.file('licenses/APACHE-LICENSE-2.0.txt')
noticeFile rootProject.file('NOTICE.txt')
}
}
diff --git a/plugins/discovery-file/build.gradle b/plugins/discovery-file/build.gradle
index 145d959fa4100..529b8cbef304d 100644
--- a/plugins/discovery-file/build.gradle
+++ b/plugins/discovery-file/build.gradle
@@ -38,6 +38,7 @@ task setupSeedNodeAndUnicastHostsFile(type: DefaultTask) {
// setup the initial cluster with one node that will serve as the seed node
// for unicast discovery
ClusterConfiguration config = new ClusterConfiguration(project)
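+// the seed node only needs the lightweight integ-test distribution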
+config.distribution = 'integ-test-zip'
config.clusterName = 'discovery-file-test-cluster'
List nodes = ClusterFormationTasks.setup(project, 'initialCluster', setupSeedNodeAndUnicastHostsFile, config)
File srcUnicastHostsFile = file('build/cluster/unicast_hosts.txt')
diff --git a/plugins/repository-hdfs/build.gradle b/plugins/repository-hdfs/build.gradle
index 631157a7e175b..8231e15af200c 100644
--- a/plugins/repository-hdfs/build.gradle
+++ b/plugins/repository-hdfs/build.gradle
@@ -153,6 +153,7 @@ for (String fixtureName : ['hdfsFixture', 'haHdfsFixture', 'secureHdfsFixture',
project.afterEvaluate {
for (String integTestTaskName : ['integTestHa', 'integTestSecure', 'integTestSecureHa']) {
ClusterConfiguration cluster = project.extensions.getByName("${integTestTaskName}Cluster") as ClusterConfiguration
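+      // these QA clusters can also run on the lightweight integ-test distribution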
+ cluster.distribution = 'integ-test-zip'
cluster.dependsOn(project.bundlePlugin)
Task restIntegTestTask = project.tasks.getByName(integTestTaskName)
diff --git a/plugins/repository-s3/build.gradle b/plugins/repository-s3/build.gradle
index 46988a2dd5107..23252881cd75f 100644
--- a/plugins/repository-s3/build.gradle
+++ b/plugins/repository-s3/build.gradle
@@ -1,5 +1,3 @@
-import org.elasticsearch.gradle.test.AntFixture
-
/*
* Licensed to Elasticsearch under one or more contributor
* license agreements. See the NOTICE file distributed with
@@ -66,28 +64,14 @@ test {
exclude '**/*CredentialsTests.class'
}
-forbiddenApisTest {
- // we are using jdk-internal instead of jdk-non-portable to allow for com.sun.net.httpserver.* usage
- bundledSignatures -= 'jdk-non-portable'
- bundledSignatures += 'jdk-internal'
-}
-
-/** A task to start the AmazonS3Fixture which emulates a S3 service **/
-task s3Fixture(type: AntFixture) {
- dependsOn compileTestJava
- env 'CLASSPATH', "${ -> project.sourceSets.test.runtimeClasspath.asPath }"
- executable = new File(project.runtimeJavaHome, 'bin/java')
- args 'org.elasticsearch.repositories.s3.AmazonS3Fixture', baseDir, 'bucket_test'
+check {
+ // also execute the QA tests when testing the plugin
+ dependsOn 'qa:amazon-s3:check'
}
integTestCluster {
- dependsOn s3Fixture
-
keystoreSetting 's3.client.integration_test.access_key', "s3_integration_test_access_key"
keystoreSetting 's3.client.integration_test.secret_key', "s3_integration_test_secret_key"
-
- /* Use a closure on the string to delay evaluation until tests are executed */
- setting 's3.client.integration_test.endpoint', "http://${ -> s3Fixture.addressAndPort }"
}
thirdPartyAudit.excludes = [
diff --git a/plugins/repository-s3/qa/amazon-s3/build.gradle b/plugins/repository-s3/qa/amazon-s3/build.gradle
new file mode 100644
index 0000000000000..5e288899021a1
--- /dev/null
+++ b/plugins/repository-s3/qa/amazon-s3/build.gradle
@@ -0,0 +1,83 @@
+/*
+ * Licensed to Elasticsearch under one or more contributor
+ * license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright
+ * ownership. Elasticsearch licenses this file to you under
+ * the Apache License, Version 2.0 (the "License"); you may
+ * not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+import org.elasticsearch.gradle.MavenFilteringHack
+import org.elasticsearch.gradle.test.AntFixture
+
+apply plugin: 'elasticsearch.standalone-rest-test'
+apply plugin: 'elasticsearch.rest-test'
+
+dependencies {
+ testCompile project(path: ':plugins:repository-s3', configuration: 'runtime')
+}
+
+integTestCluster {
+ plugin ':plugins:repository-s3'
+}
+
+forbiddenApisTest {
+ // we are using jdk-internal instead of jdk-non-portable to allow for com.sun.net.httpserver.* usage
+ bundledSignatures -= 'jdk-non-portable'
+ bundledSignatures += 'jdk-internal'
+}
+
+boolean useFixture = false
+
+String s3AccessKey = System.getenv("amazon_s3_access_key")
+String s3SecretKey = System.getenv("amazon_s3_secret_key")
+String s3Bucket = System.getenv("amazon_s3_bucket")
+String s3BasePath = System.getenv("amazon_s3_base_path")
+
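+// fall back to the local AmazonS3Fixture when no live S3 credentials are provided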
+if (!s3AccessKey && !s3SecretKey && !s3Bucket && !s3BasePath) {
+ s3AccessKey = 's3_integration_test_access_key'
+ s3SecretKey = 's3_integration_test_secret_key'
+ s3Bucket = 'bucket_test'
+ s3BasePath = 'integration_test'
+ useFixture = true
+}
+
+/** A task to start the AmazonS3Fixture which emulates an S3 service **/
+task s3Fixture(type: AntFixture) {
+ dependsOn compileTestJava
+ env 'CLASSPATH', "${ -> project.sourceSets.test.runtimeClasspath.asPath }"
+ executable = new File(project.runtimeJavaHome, 'bin/java')
+ args 'org.elasticsearch.repositories.s3.AmazonS3Fixture', baseDir, s3Bucket
+}
+
+Map expansions = [
+ 'bucket': s3Bucket,
+ 'base_path': s3BasePath
+]
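+// substitute the bucket and base_path placeholders in the REST test resources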
+processTestResources {
+ inputs.properties(expansions)
+ MavenFilteringHack.filter(it, expansions)
+}
+
+integTestCluster {
+ keystoreSetting 's3.client.integration_test.access_key', s3AccessKey
+ keystoreSetting 's3.client.integration_test.secret_key', s3SecretKey
+
+ if (useFixture) {
+ dependsOn s3Fixture
+ /* Use a closure on the string to delay evaluation until tests are executed */
+ setting 's3.client.integration_test.endpoint', "http://${-> s3Fixture.addressAndPort}"
+ } else {
+ println "Using an external service to test the repository-s3 plugin"
+ }
+}
\ No newline at end of file
diff --git a/plugins/repository-s3/src/test/java/org/elasticsearch/repositories/s3/AmazonS3Fixture.java b/plugins/repository-s3/qa/amazon-s3/src/test/java/org/elasticsearch/repositories/s3/AmazonS3Fixture.java
similarity index 100%
rename from plugins/repository-s3/src/test/java/org/elasticsearch/repositories/s3/AmazonS3Fixture.java
rename to plugins/repository-s3/qa/amazon-s3/src/test/java/org/elasticsearch/repositories/s3/AmazonS3Fixture.java
diff --git a/plugins/repository-s3/qa/amazon-s3/src/test/java/org/elasticsearch/repositories/s3/AmazonS3RepositoryClientYamlTestSuiteIT.java b/plugins/repository-s3/qa/amazon-s3/src/test/java/org/elasticsearch/repositories/s3/AmazonS3RepositoryClientYamlTestSuiteIT.java
new file mode 100644
index 0000000000000..afcc0fa353482
--- /dev/null
+++ b/plugins/repository-s3/qa/amazon-s3/src/test/java/org/elasticsearch/repositories/s3/AmazonS3RepositoryClientYamlTestSuiteIT.java
@@ -0,0 +1,37 @@
+/*
+ * Licensed to Elasticsearch under one or more contributor
+ * license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright
+ * ownership. Elasticsearch licenses this file to you under
+ * the Apache License, Version 2.0 (the "License"); you may
+ * not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.elasticsearch.repositories.s3;
+
+import com.carrotsearch.randomizedtesting.annotations.Name;
+import com.carrotsearch.randomizedtesting.annotations.ParametersFactory;
+import org.elasticsearch.test.rest.yaml.ClientYamlTestCandidate;
+import org.elasticsearch.test.rest.yaml.ESClientYamlSuiteTestCase;
+
+public class AmazonS3RepositoryClientYamlTestSuiteIT extends ESClientYamlSuiteTestCase {
+
+ public AmazonS3RepositoryClientYamlTestSuiteIT(@Name("yaml") ClientYamlTestCandidate testCandidate) {
+ super(testCandidate);
+ }
+
+ @ParametersFactory
+    public static Iterable<Object[]> parameters() throws Exception {
+        return ESClientYamlSuiteTestCase.createParameters();
+    }
+}