Commit 9fd791a
Merge remote-tracking branch 'origin/master' into reindex_v2
henningandersen committed Feb 19, 2020
2 parents 83dc008 + 0c309ef
Showing 89 changed files with 1,848 additions and 1,261 deletions.
2 changes: 1 addition & 1 deletion buildSrc/src/main/resources/minimumGradleVersion
@@ -1 +1 @@
-6.1.1
+6.2
2 changes: 1 addition & 1 deletion buildSrc/version.properties
@@ -29,7 +29,7 @@ joda = 2.10.4
# - plugins/ingest-attachment (transitive dependency, check the upstream POM)
# - distribution/tools/plugin-cli
# - x-pack/plugin/security
-bouncycastle = 1.61
+bouncycastle=1.64
# test dependencies
randomizedrunner = 2.7.6
junit = 4.12
2 changes: 1 addition & 1 deletion distribution/packages/build.gradle
@@ -52,7 +52,7 @@ import java.util.regex.Pattern
*/

plugins {
id "nebula.ospackage-base" version "8.0.3"
id "nebula.ospackage-base" version "8.1.0"
}

void addProcessFilesTask(String type, boolean oss, boolean jdk) {
10 changes: 8 additions & 2 deletions docs/reference/ml/df-analytics/apis/put-dfanalytics.asciidoc
@@ -150,7 +150,10 @@ include::{docdir}/ml/ml-shared.asciidoc[tag=randomize-seed]

`analysis`.`classification`.`num_top_feature_importance_values`::::
(Optional, integer)
-include::{docdir}/ml/ml-shared.asciidoc[tag=num-top-feature-importance-values]
+Advanced configuration option. Specifies the maximum number of
+{ml-docs}/dfa-classification.html#dfa-classification-feature-importance[feature
+importance] values per document to return. By default, it is zero and no feature importance
+calculation occurs.

`analysis`.`classification`.`training_percent`::::
(Optional, integer)
@@ -233,7 +236,10 @@ include::{docdir}/ml/ml-shared.asciidoc[tag=prediction-field-name]

`analysis`.`regression`.`num_top_feature_importance_values`::::
(Optional, integer)
-include::{docdir}/ml/ml-shared.asciidoc[tag=num-top-feature-importance-values]
+Advanced configuration option. Specifies the maximum number of
+{ml-docs}/dfa-regression.html#dfa-regression-feature-importance[feature importance]
+values per document to return. By default, it is zero and no feature importance calculation
+occurs.

`analysis`.`regression`.`training_percent`::::
(Optional, integer)
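The hunks above inline the parameter description in place of the shared tag that this commit deletes from `ml-shared.asciidoc` below. As a rough sketch of where the option sits in practice, here is a hedged create-job example (the job ID, index names, and `dependent_variable` are invented for illustration, not taken from this commit):

[source,console]
----
PUT _ml/data_frame/analytics/weight-regression
{
  "source": { "index": "animal-weights" },
  "dest": { "index": "animal-weight-predictions" },
  "analysis": {
    "regression": {
      "dependent_variable": "weight",
      "num_top_feature_importance_values": 3
    }
  }
}
----

With a positive value, each prediction document can carry feature importance for up to that many features; the default of zero skips the calculation. The same shape applies under `analysis`.`classification`.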
7 changes: 0 additions & 7 deletions docs/reference/ml/ml-shared.asciidoc
@@ -906,13 +906,6 @@ total number of categories (in the {version} version of the {stack}, it's two)
to predict then we will report all category probabilities. Defaults to 2.
end::num-top-classes[]

-tag::num-top-feature-importance-values[]
-Advanced configuration option. If set, feature importance for the top
-most important features will be computed. Importance is calculated
-using the SHAP (SHapley Additive exPlanations) method as described in
-https://papers.nips.cc/paper/7062-a-unified-approach-to-interpreting-model-predictions.pdf[Lundberg, S. M., & Lee, S.-I. A Unified Approach to Interpreting Model Predictions. In NeurIPS 2017.].
-end::num-top-feature-importance-values[]
-
tag::over-field-name[]
The field used to split the data. In particular, this property is used for
analyzing the splits with respect to the history of all splits. It is used for
7 changes: 2 additions & 5 deletions docs/reference/sql/endpoints/client-apps/dbeaver.asciidoc
@@ -3,16 +3,13 @@
[[sql-client-apps-dbeaver]]
=== DBeaver

-[quote, https://dbeaver.io/]
-____
-https://dbeaver.io/[DBeaver] is free and open source universal database tool for developers and database administrators.
-____
+You can use the {es} JDBC driver to access {es} data from DBeaver.

IMPORTANT: Elastic does not endorse, promote or provide support for this application; for native Elasticsearch integration in this product, please reach out to its vendor.

==== Prerequisites

-* DBeaver version 6.0.0 or higher
+* https://dbeaver.io/[DBeaver] version 6.0.0 or higher
* {es-sql} <<sql-jdbc, JDBC driver>>

==== New Connection
7 changes: 2 additions & 5 deletions docs/reference/sql/endpoints/client-apps/dbvis.asciidoc
@@ -3,16 +3,13 @@
[[sql-client-apps-dbvis]]
=== DbVisualizer

-[quote, http://www.dbvis.com/]
-____
-https://www.dbvis.com/[DbVisualizer] is a database management and analysis tool for all major databases.
-____
+You can use the {es} JDBC driver to access {es} data from DbVisualizer.

IMPORTANT: Elastic does not endorse, promote or provide support for this application; for native Elasticsearch integration in this product, please reach out to its vendor.

==== Prerequisites

-* DbVisualizer 10.0.21 or higher
+* https://www.dbvis.com/[DbVisualizer] 10.0.21 or higher
* {es-sql} <<sql-jdbc, JDBC driver>>

==== Add {es} JDBC driver
8 changes: 2 additions & 6 deletions docs/reference/sql/endpoints/client-apps/excel.asciidoc
@@ -3,17 +3,13 @@
[[sql-client-apps-excel]]
=== Microsoft Excel

-[quote, https://www.techopedia.com/definition/5430/microsoft-excel]
-____
-https://products.office.com/en/excel[Microsoft Excel] is a software program [...] that allows users to organize, format and calculate data
-with formulas using a spreadsheet system.
-____
+You can use the {es} ODBC driver to access {es} data from Microsoft Excel.

IMPORTANT: Elastic does not endorse, promote or provide support for this application; for native Elasticsearch integration in this product, please reach out to its vendor.

==== Prerequisites

-* Microsoft Office 2016 or higher
+* https://products.office.com/en/excel[Microsoft Office] 2016 or higher
* {es-sql} <<sql-odbc, ODBC driver>>
* A preconfigured User or System DSN (see <<dsn-configuration,Configuration>> section on how to configure a DSN).

8 changes: 2 additions & 6 deletions docs/reference/sql/endpoints/client-apps/microstrat.asciidoc
@@ -3,17 +3,13 @@
[[sql-client-apps-microstrat]]
=== MicroStrategy Desktop

-[quote, https://www.microstrategy.com/us/resources/library/videos/new-microstrategy-desktop]
-____
-https://www.microstrategy.com/us/get-started/desktop[MicroStrategy Desktop] is a free data discovery tool that helps people bring data to
-life using powerful self-service analytics.
-____
+You can use the {es} ODBC driver to access {es} data from MicroStrategy Desktop.

IMPORTANT: Elastic does not endorse, promote or provide support for this application; for native Elasticsearch integration in this product, please reach out to its vendor.

==== Prerequisites

-* MicroStrategy Desktop 11 or higher
+* https://www.microstrategy.com/us/get-started/desktop[MicroStrategy Desktop] 11 or higher
* {es-sql} <<sql-odbc, ODBC driver>>
* A preconfigured User or System DSN (see <<dsn-configuration,Configuration>> section on how to configure a DSN).

8 changes: 2 additions & 6 deletions docs/reference/sql/endpoints/client-apps/powerbi.asciidoc
@@ -3,17 +3,13 @@
[[sql-client-apps-powerbi]]
=== Microsoft Power BI Desktop

-[quote, https://powerbi.microsoft.com/en-us/what-is-power-bi/]
-____
-https://powerbi.microsoft.com/en-us/desktop/[Power BI] is a business analytics solution that lets you visualize your data and share
-insights across your organization, or embed them in your app or website.
-____
+You can use the {es} ODBC driver to access {es} data from Microsoft Power BI Desktop.

IMPORTANT: Elastic does not endorse, promote or provide support for this application; for native Elasticsearch integration in this product, please reach out to its vendor.

==== Prerequisites

-* Microsoft Power BI Desktop 2.63 or higher
+* https://powerbi.microsoft.com/en-us/desktop/[Microsoft Power BI Desktop] 2.63 or higher
* {es-sql} <<sql-odbc, ODBC driver>>
* A preconfigured User or System DSN (see <<dsn-configuration,Configuration>> section on how to configure a DSN).

9 changes: 2 additions & 7 deletions docs/reference/sql/endpoints/client-apps/ps1.asciidoc
@@ -3,18 +3,13 @@
[[sql-client-apps-ps1]]
=== Microsoft PowerShell

-[quote, https://docs.microsoft.com/en-us/powershell/scripting/powershell-scripting]
-____
-https://docs.microsoft.com/en-us/powershell/[PowerShell] is a task-based command-line shell and scripting language built on .NET.
-____
-
-PowerShell is available on all recent Windows Desktop OSes. It also has embedded ODBC support, thus offering a quick and accessible way to connect to {es}.
+You can use the {es} ODBC driver to access {es} data from Microsoft PowerShell.

IMPORTANT: Elastic does not endorse, promote or provide support for this application; for native Elasticsearch integration in this product, please reach out to its vendor.

==== Prerequisites

-* Microsoft PowerShell
+* https://docs.microsoft.com/en-us/powershell/[Microsoft PowerShell]
* {es-sql} <<sql-odbc, ODBC driver>>
* A preconfigured User or System DSN (see <<dsn-configuration,Configuration>> section on how to configure a DSN).

9 changes: 2 additions & 7 deletions docs/reference/sql/endpoints/client-apps/qlik.asciidoc
@@ -3,18 +3,13 @@
[[sql-client-apps-qlik]]
=== Qlik Sense Desktop

-[quote, https://help.qlik.com/en-US/sense/February2018/Subsystems/Hub/Content/Introduction/at-a-glance.htm]
-____
-https://www.qlik.com/us/try-or-buy/download-qlik-sense[Qlik Sense Desktop] is a Windows application that gives individuals the opportunity
-to use Qlik Sense and create personalized, interactive data visualizations, reports, and dashboards from multiple data sources with
-drag-and-drop ease.
-____
+You can use the {es} ODBC driver to access {es} data from Qlik Sense Desktop.

IMPORTANT: Elastic does not endorse, promote or provide support for this application; for native Elasticsearch integration in this product, please reach out to its vendor.

==== Prerequisites

-* Qlik Sense Desktop November 2018 or higher
+* https://www.qlik.com/us/try-or-buy/download-qlik-sense[Qlik Sense Desktop] November 2018 or higher
* {es-sql} <<sql-odbc, ODBC driver>>
* A preconfigured User or System DSN (see <<dsn-configuration,Configuration>> section on how to configure a DSN).

7 changes: 2 additions & 5 deletions docs/reference/sql/endpoints/client-apps/squirrel.asciidoc
@@ -3,16 +3,13 @@
[[sql-client-apps-squirrel]]
=== SQuirreL SQL

-[quote, http://squirrel-sql.sourceforge.net/]
-____
-http://squirrel-sql.sourceforge.net/[SQuirreL SQL] is a graphical, [multi-platform] Java program that will allow you to view the structure of a JDBC compliant database [...].
-____
+You can use the {es} JDBC driver to access {es} data from SQuirreL SQL.

IMPORTANT: Elastic does not endorse, promote or provide support for this application; for native Elasticsearch integration in this product, please reach out to its vendor.

==== Prerequisites

-* SQuirreL SQL version 4.0.0 or higher
+* http://squirrel-sql.sourceforge.net/[SQuirreL SQL] version 4.0.0 or higher
* {es-sql} <<sql-jdbc, JDBC driver>>

==== Add {es} JDBC Driver
8 changes: 2 additions & 6 deletions docs/reference/sql/endpoints/client-apps/tableau.asciidoc
@@ -3,17 +3,13 @@
[[sql-client-apps-tableau]]
=== Tableau Desktop

-[quote, https://www.tableau.com/products/what-is-tableau]
-____
-https://www.tableau.com/products/desktop[Tableau] is the most powerful, secure, and flexible end-to-end analytics platform
-for your data.
-____
+You can use the {es} ODBC driver to access {es} data from Tableau Desktop.

IMPORTANT: Elastic does not endorse, promote or provide support for this application; for native Elasticsearch integration in this product, please reach out to its vendor.

==== Prerequisites

-* Tableau 2018 or higher
+* https://www.tableau.com/products/desktop[Tableau Desktop] 2018 or higher
* {es-sql} <<sql-odbc, ODBC driver>>
* A preconfigured User or System DSN (see <<dsn-configuration,Configuration>> section on how to configure a DSN).

7 changes: 2 additions & 5 deletions docs/reference/sql/endpoints/client-apps/workbench.asciidoc
@@ -3,16 +3,13 @@
[[sql-client-apps-workbench]]
=== SQL Workbench/J

-[quote, https://www.sql-workbench.eu/]
-____
-https://www.sql-workbench.eu/[SQL Workbench/J] is a free, DBMS-independent, cross-platform SQL query tool.
-____
+You can use the {es} JDBC driver to access {es} data from SQL Workbench/J.

IMPORTANT: Elastic does not endorse, promote or provide support for this application; for native Elasticsearch integration in this product, please reach out to its vendor.

==== Prerequisites

-* SQL Workbench/J build 125 or higher
+* https://www.sql-workbench.eu/[SQL Workbench/J] build 125 or higher
* {es-sql} <<sql-jdbc, JDBC driver>>

==== Add {es} JDBC driver
11 changes: 3 additions & 8 deletions gradle/build-scan.gradle
@@ -1,4 +1,5 @@
import org.elasticsearch.gradle.OS
+import org.elasticsearch.gradle.info.BuildParams
import org.gradle.initialization.BuildRequestMetaData

import java.util.concurrent.TimeUnit
@@ -76,14 +77,8 @@ buildScan {
value 'Git Branch', branch
tag branch
}
-if (System.getenv('GIT_COMMIT')) {
-value 'Git Commit ID', System.getenv('GIT_COMMIT')
-link 'Source', "https://github.com/elastic/elasticsearch/tree/${System.getenv('GIT_COMMIT')}"
-background {
-def changes = "git diff --name-only ${System.getenv('GIT_PREVIOUS_COMMIT')}..${System.getenv('GIT_COMMIT')}".execute().text.trim()
-value 'Git Changes', changes
-}
-}
+value 'Git Commit ID', BuildParams.gitRevision
+link 'Source', "https://github.com/elastic/elasticsearch/tree/${BuildParams.gitRevision}"
}
} else {
tag 'LOCAL'
4 changes: 2 additions & 2 deletions gradle/wrapper/gradle-wrapper.properties
@@ -1,6 +1,6 @@
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
-distributionUrl=https\://services.gradle.org/distributions/gradle-6.1.1-all.zip
+distributionUrl=https\://services.gradle.org/distributions/gradle-6.2-all.zip
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
-distributionSha256Sum=10065868c78f1207afb3a92176f99a37d753a513dff453abb6b5cceda4058cda
+distributionSha256Sum=f016e66d88c2f9adb5b6e7dff43a363b8c2632f18b4ad6f365f49da34dd57db8
3 changes: 3 additions & 0 deletions gradlew.bat
@@ -29,6 +29,9 @@ if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%

+@rem Resolve any "." and ".." in APP_HOME to make it shorter.
+for %%i in ("%APP_HOME%") do set APP_HOME=%%~fi
+
@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
set DEFAULT_JVM_OPTS="-Xmx64m" "-Xms64m"

Expand Down

This file was deleted.

@@ -0,0 +1 @@
+7a2601f0a1d336966cca03edb04a69ba0f5f25d9

This file was deleted.

@@ -0,0 +1 @@
+3dac163e20110817d850d17e0444852a6d7d0bd7

This file was deleted.

@@ -0,0 +1 @@
+1467dac1b787b5ad2a18201c0c281df69882259e
@@ -33,6 +33,7 @@
import org.elasticsearch.common.UUIDs;
import org.elasticsearch.common.blobstore.BlobContainer;
import org.elasticsearch.common.blobstore.BlobPath;
+import org.elasticsearch.common.bytes.BytesArray;
import org.elasticsearch.common.bytes.BytesReference;
import org.elasticsearch.common.collect.Tuple;
import org.elasticsearch.common.io.Streams;
@@ -52,7 +53,6 @@
import org.junit.Before;
import org.threeten.bp.Duration;

-import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.InetAddress;
@@ -323,14 +323,16 @@ public void testWriteLargeBlob() throws IOException {
logger.debug("starting with resumable upload id [{}]", sessionUploadId.get());

httpServer.createContext("/upload/storage/v1/b/bucket/o", safeHandler(exchange -> {
+final BytesReference requestBody = Streams.readFully(exchange.getRequestBody());
+
final Map<String, String> params = new HashMap<>();
RestUtils.decodeQueryString(exchange.getRequestURI().getQuery(), 0, params);
assertThat(params.get("uploadType"), equalTo("resumable"));

if ("POST".equals(exchange.getRequestMethod())) {
assertThat(params.get("name"), equalTo("write_large_blob"));
if (countInits.decrementAndGet() <= 0) {
-byte[] response = Streams.readFully(exchange.getRequestBody()).utf8ToString().getBytes(UTF_8);
+byte[] response = requestBody.utf8ToString().getBytes(UTF_8);
exchange.getResponseHeaders().add("Content-Type", "application/json");
exchange.getResponseHeaders().add("Location", httpServerUrl() +
"/upload/storage/v1/b/bucket/o?uploadType=resumable&upload_id=" + sessionUploadId.get());
@@ -348,7 +350,6 @@ public void testWriteLargeBlob() throws IOException {
if (uploadId.equals(sessionUploadId.get()) == false) {
logger.debug("session id [{}] is gone", uploadId);
assertThat(wrongChunk, greaterThan(0));
-Streams.readFully(exchange.getRequestBody());
exchange.sendResponseHeaders(HttpStatus.SC_GONE, -1);
return;
}
@@ -367,7 +368,6 @@ public void testWriteLargeBlob() throws IOException {
countInits.set(nbErrors);
countUploads.set(nbErrors * totalChunks);

-Streams.readFully(exchange.getRequestBody());
exchange.sendResponseHeaders(HttpStatus.SC_GONE, -1);
return;
}
@@ -377,14 +377,12 @@ public void testWriteLargeBlob() throws IOException {
assertTrue(Strings.hasLength(range));

if (countUploads.decrementAndGet() % 2 == 0) {
-final ByteArrayOutputStream requestBody = new ByteArrayOutputStream();
-final long bytesRead = Streams.copy(exchange.getRequestBody(), requestBody);
-assertThat(Math.toIntExact(bytesRead), anyOf(equalTo(defaultChunkSize), equalTo(lastChunkSize)));
+assertThat(Math.toIntExact(requestBody.length()), anyOf(equalTo(defaultChunkSize), equalTo(lastChunkSize)));

final int rangeStart = getContentRangeStart(range);
final int rangeEnd = getContentRangeEnd(range);
-assertThat(rangeEnd + 1 - rangeStart, equalTo(Math.toIntExact(bytesRead)));
-assertArrayEquals(Arrays.copyOfRange(data, rangeStart, rangeEnd + 1), requestBody.toByteArray());
+assertThat(rangeEnd + 1 - rangeStart, equalTo(Math.toIntExact(requestBody.length())));
+assertThat(new BytesArray(data, rangeStart, rangeEnd - rangeStart + 1), is(requestBody));

final Integer limit = getContentRangeLimit(range);
if (limit != null) {
@@ -399,8 +397,6 @@ public void testWriteLargeBlob() throws IOException {
}
}

-// read all the request body, otherwise the SDK client throws a non-retryable StorageException
-Streams.readFully(exchange.getRequestBody());
if (randomBoolean()) {
exchange.sendResponseHeaders(HttpStatus.SC_INTERNAL_SERVER_ERROR, -1);
}