Merge branch 'master' into retention-leases-version
* master:
  Replace awaitBusy with assertBusy in atLeastDocsIndexed (elastic#38190)
  Adjust SearchRequest version checks (elastic#38181)
  AwaitsFix testClientSucceedsWithVerificationDisabled (elastic#38213)
  Zen2ify RareClusterStateIT (elastic#38184)
  ML: Fix error race condition on stop _all datafeeds and close _all jobs (elastic#38113)
  AwaitsFix PUT mapping with _doc on an index that has types (elastic#38204)
  Allow built-in monitoring_user role to call GET _xpack API (elastic#38060)
  Update geo_shape docs to include unsupported features (elastic#38138)
  [ML] Remove "8" prefixes from file structure finder timestamp formats (elastic#38016)
  Disable bwc tests while backporting elastic#38104 (elastic#38182)
  Enable TLSv1.3 by default for JDKs with support (elastic#38103)
  Fix _host based require filters (elastic#38173)
  RestoreService should update primary terms when restoring shards of existing indices (elastic#38177)
  Throw if two inner_hits have the same name (elastic#37645)
jasontedor committed Feb 1, 2019
2 parents 7cf145c + f64b203 commit 686d35e
Showing 38 changed files with 422 additions and 225 deletions.
2 changes: 1 addition & 1 deletion build.gradle
@@ -160,7 +160,7 @@ task verifyVersions {
* after the backport of the backcompat code is complete.
*/
final boolean bwc_tests_enabled = false
final String bwc_tests_disabled_issue = "https://github.com/elastic/elasticsearch/pull/37951" /* place a PR link here when committing bwc changes */
final String bwc_tests_disabled_issue = "https://github.com/elastic/elasticsearch/pull/37951,https://github.com/elastic/elasticsearch/pull/38180" /* place a PR link here when committing bwc changes */
if (bwc_tests_enabled == false) {
if (bwc_tests_disabled_issue.isEmpty()) {
throw new GradleException("bwc_tests_disabled_issue must be set when bwc_tests_enabled == false")
16 changes: 15 additions & 1 deletion docs/reference/mapping/types/geo-shape.asciidoc
@@ -21,7 +21,7 @@ type.
|=======================================================================
|Option |Description| Default

|`tree |deprecated[6.6, PrefixTrees no longer used] Name of the PrefixTree
|`tree` |deprecated[6.6, PrefixTrees no longer used] Name of the PrefixTree
implementation to be used: `geohash` for GeohashPrefixTree and `quadtree`
for QuadPrefixTree. Note: This parameter is only relevant for `term` and
`recursive` strategies.
@@ -127,6 +127,20 @@ the `tree` or `strategy` parameters according to the appropriate
<<geo-shape-mapping-options>>. Note that these parameters are now deprecated
and will be removed in a future version.

*IMPORTANT NOTES*

The following features are not yet supported with the new indexing approach:

* `geo_shape` query with `MultiPoint` geometry types - Elasticsearch currently prevents searching
geo_shape fields with a MultiPoint geometry type to avoid a brute force linear search
over each individual point. For now, if this is absolutely needed, this can be achieved
using a `bool` query with each individual point.

* `CONTAINS` relation query - when using the new default vector indexing strategy, `geo_shape`
queries with `relation` defined as `contains` are not yet supported. If this query relation
is an absolute necessity, it is recommended to set `strategy` to `quadtree` and use the
deprecated PrefixTree strategy indexing approach.
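
(Illustrative only, not part of this change.) A rough Java sketch of the per-point `bool` workaround described above, assuming a `geo_shape` field named `location`; each point of the original `MultiPoint` becomes its own `geo_shape` clause, passed as raw JSON through a wrapper query so the sketch does not depend on a particular shape-builder API:

[source,java]
--------------------------------------------------
import org.elasticsearch.index.query.BoolQueryBuilder;
import org.elasticsearch.index.query.QueryBuilders;

public class MultiPointWorkaroundSketch {

    // Builds a single geo_shape point query as raw JSON for the assumed field "location".
    static String pointQuery(double lon, double lat) {
        return "{ \"geo_shape\": { \"location\": { "
            + "\"shape\": { \"type\": \"point\", \"coordinates\": [" + lon + ", " + lat + "] }, "
            + "\"relation\": \"intersects\" } } }";
    }

    public static void main(String[] args) {
        // One should clause per point of the original MultiPoint; a document matches if any point matches.
        BoolQueryBuilder workaround = QueryBuilders.boolQuery()
            .should(QueryBuilders.wrapperQuery(pointQuery(-77.03653, 38.897676)))
            .should(QueryBuilders.wrapperQuery(pointQuery(-77.009051, 38.889939)));
        System.out.println(workaround);
    }
}
--------------------------------------------------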

[[prefix-trees]]
[float]
==== Prefix trees
4 changes: 2 additions & 2 deletions docs/reference/migration/migrate_7_0/settings.asciidoc
@@ -138,11 +138,11 @@ used.

TLS version 1.0 is now disabled by default as it suffers from
https://www.owasp.org/index.php/Transport_Layer_Protection_Cheat_Sheet#Rule_-_Only_Support_Strong_Protocols[known security issues].
The default protocols are now TLSv1.2 and TLSv1.1.
The default protocols are now TLSv1.3 (if supported), TLSv1.2 and TLSv1.1.
You can enable TLS v1.0 by configuring the relevant `ssl.supported_protocols` setting to include `"TLSv1"`, for example:
[source,yaml]
--------------------------------------------------
xpack.security.http.ssl.supported_protocols: [ "TLSv1.2", "TLSv1.1", "TLSv1" ]
xpack.security.http.ssl.supported_protocols: [ "TLSv1.3", "TLSv1.2", "TLSv1.1", "TLSv1" ]
--------------------------------------------------

[float]
10 changes: 5 additions & 5 deletions docs/reference/ml/apis/find-file-structure.asciidoc
@@ -606,11 +606,11 @@ If the request does not encounter errors, you receive the following result:
},
"tpep_dropoff_datetime" : {
"type" : "date",
"format" : "8yyyy-MM-dd HH:mm:ss"
"format" : "yyyy-MM-dd HH:mm:ss"
},
"tpep_pickup_datetime" : {
"type" : "date",
"format" : "8yyyy-MM-dd HH:mm:ss"
"format" : "yyyy-MM-dd HH:mm:ss"
},
"trip_distance" : {
"type" : "double"
@@ -624,7 +624,7 @@ If the request does not encounter errors, you receive the following result:
"field" : "tpep_pickup_datetime",
"timezone" : "{{ beat.timezone }}",
"formats" : [
"8yyyy-MM-dd HH:mm:ss"
"yyyy-MM-dd HH:mm:ss"
]
}
}
@@ -1398,7 +1398,7 @@ this:
"field" : "timestamp",
"timezone" : "{{ beat.timezone }}",
"formats" : [
"8yyyy-MM-dd'T'HH:mm:ss,SSS"
"yyyy-MM-dd'T'HH:mm:ss,SSS"
]
}
},
@@ -1558,7 +1558,7 @@ this:
"field" : "timestamp",
"timezone" : "{{ beat.timezone }}",
"formats" : [
"8yyyy-MM-dd'T'HH:mm:ss,SSS"
"yyyy-MM-dd'T'HH:mm:ss,SSS"
]
}
},
12 changes: 8 additions & 4 deletions docs/reference/settings/security-settings.asciidoc
@@ -480,7 +480,8 @@ and `full`. Defaults to `full`.
See <<ssl-tls-settings,`ssl.verification_mode`>> for an explanation of these values.

`ssl.supported_protocols`::
Supported protocols for TLS/SSL (with versions). Defaults to `TLSv1.2,TLSv1.1`.
Supported protocols for TLS/SSL (with versions). Defaults to `TLSv1.3,TLSv1.2,TLSv1.1` if
the JVM supports TLSv1.3, otherwise `TLSv1.2,TLSv1.1`.

`ssl.cipher_suites`:: Specifies the cipher suites that should be supported when
communicating with the LDAP server.
@@ -724,7 +725,8 @@ and `full`. Defaults to `full`.
See <<ssl-tls-settings,`ssl.verification_mode`>> for an explanation of these values.

`ssl.supported_protocols`::
Supported protocols for TLS/SSL (with versions). Defaults to `TLSv1.2, TLSv1.1`.
Supported protocols for TLS/SSL (with versions). Defaults to `TLSv1.3,TLSv1.2,TLSv1.1` if
the JVM supports TLSv1.3, otherwise `TLSv1.2,TLSv1.1`.

`ssl.cipher_suites`:: Specifies the cipher suites that should be supported when
communicating with the Active Directory server.
@@ -1132,7 +1134,8 @@ Defaults to `full`.
See <<ssl-tls-settings,`ssl.verification_mode`>> for a more detailed explanation of these values.

`ssl.supported_protocols`::
Specifies the supported protocols for TLS/SSL.
Specifies the supported protocols for TLS/SSL. Defaults to `TLSv1.3,TLSv1.2,TLSv1.1` if
the JVM supports TLSv1.3, otherwise `TLSv1.2,TLSv1.1`.

`ssl.cipher_suites`::
Specifies the
@@ -1206,7 +1209,8 @@ settings. For more information, see

`ssl.supported_protocols`::
Supported protocols with versions. Valid protocols: `SSLv2Hello`,
`SSLv3`, `TLSv1`, `TLSv1.1`, `TLSv1.2`. Defaults to `TLSv1.2`, `TLSv1.1`.
`SSLv3`, `TLSv1`, `TLSv1.1`, `TLSv1.2`, `TLSv1.3`. Defaults to `TLSv1.3,TLSv1.2,TLSv1.1` if
the JVM supports TLSv1.3, otherwise `TLSv1.2,TLSv1.1`.
+
--
NOTE: If `xpack.security.fips_mode.enabled` is `true`, you cannot use `SSLv2Hello`
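
As an aside (not part of the diff), a minimal Java sketch of pinning `supported_protocols` explicitly so the effective list no longer depends on the JVM; the `xpack.security.http.ssl.supported_protocols` key is the same one shown in the migration note above, and the chosen values are only an example:

[source,java]
--------------------------------------------------
import org.elasticsearch.common.settings.Settings;

public class PinnedProtocolsSketch {
    public static void main(String[] args) {
        // Explicitly pin the HTTP TLS protocols instead of relying on the JVM-dependent default.
        Settings settings = Settings.builder()
            .putList("xpack.security.http.ssl.supported_protocols", "TLSv1.3", "TLSv1.2")
            .build();
        System.out.println(settings.getAsList("xpack.security.http.ssl.supported_protocols"));
    }
}
--------------------------------------------------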
3 changes: 2 additions & 1 deletion docs/reference/settings/ssl-settings.asciidoc
@@ -11,7 +11,8 @@ endif::server[]

+{ssl-prefix}.ssl.supported_protocols+::
Supported protocols with versions. Valid protocols: `SSLv2Hello`,
`SSLv3`, `TLSv1`, `TLSv1.1`, `TLSv1.2`. Defaults to `TLSv1.2`, `TLSv1.1`.
`SSLv3`, `TLSv1`, `TLSv1.1`, `TLSv1.2`, `TLSv1.3`. Defaults to `TLSv1.3,TLSv1.2,TLSv1.1` if
the JVM supports TLSv1.3, otherwise `TLSv1.2,TLSv1.1`.


ifdef::server[]
@@ -24,11 +24,14 @@
import javax.net.ssl.X509ExtendedTrustManager;
import java.nio.file.Path;
import java.security.GeneralSecurityException;
import java.util.Arrays;
import java.security.NoSuchAlgorithmException;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Objects;
import java.util.Set;

@@ -40,6 +43,30 @@
*/
public class SslConfiguration {

/**
* An ordered map of protocol algorithms to SSLContext algorithms. The map is ordered from most
* secure to least secure. The names in this map are taken from the
* <a href="https://docs.oracle.com/en/java/javase/11/docs/specs/security/standard-names.html#sslcontext-algorithms">
* Java Security Standard Algorithm Names Documentation for Java 11</a>.
*/
static final Map<String, String> ORDERED_PROTOCOL_ALGORITHM_MAP;
static {
LinkedHashMap<String, String> protocolAlgorithmMap = new LinkedHashMap<>();
try {
SSLContext.getInstance("TLSv1.3");
protocolAlgorithmMap.put("TLSv1.3", "TLSv1.3");
} catch (NoSuchAlgorithmException e) {
// ignore since we support JVMs that do not support TLSv1.3
}
protocolAlgorithmMap.put("TLSv1.2", "TLSv1.2");
protocolAlgorithmMap.put("TLSv1.1", "TLSv1.1");
protocolAlgorithmMap.put("TLSv1", "TLSv1");
protocolAlgorithmMap.put("SSLv3", "SSLv3");
protocolAlgorithmMap.put("SSLv2", "SSL");
protocolAlgorithmMap.put("SSLv2Hello", "SSL");
ORDERED_PROTOCOL_ALGORITHM_MAP = Collections.unmodifiableMap(protocolAlgorithmMap);
}

private final SslTrustConfig trustConfig;
private final SslKeyConfig keyConfig;
private final SslVerificationMode verificationMode;
@@ -124,12 +151,13 @@ private String contextProtocol() {
if (supportedProtocols.isEmpty()) {
throw new SslConfigException("no SSL/TLS protocols have been configured");
}
for (String tryProtocol : Arrays.asList("TLSv1.2", "TLSv1.1", "TLSv1", "SSLv3")) {
if (supportedProtocols.contains(tryProtocol)) {
return tryProtocol;
for (Entry<String, String> entry : ORDERED_PROTOCOL_ALGORITHM_MAP.entrySet()) {
if (supportedProtocols.contains(entry.getKey())) {
return entry.getValue();
}
}
return "SSL";
throw new SslConfigException("no supported SSL/TLS protocol was found in the configured supported protocols: "
+ supportedProtocols);
}

@Override
@@ -26,12 +26,14 @@
import java.security.NoSuchAlgorithmException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Objects;
import java.util.function.Function;
import java.util.stream.Collectors;

import static org.elasticsearch.common.ssl.KeyStoreUtil.inferKeyStoreType;
import static org.elasticsearch.common.ssl.SslConfiguration.ORDERED_PROTOCOL_ALGORITHM_MAP;
import static org.elasticsearch.common.ssl.SslConfigurationKeys.CERTIFICATE;
import static org.elasticsearch.common.ssl.SslConfigurationKeys.CERTIFICATE_AUTHORITIES;
import static org.elasticsearch.common.ssl.SslConfigurationKeys.CIPHERS;
@@ -68,7 +70,9 @@
*/
public abstract class SslConfigurationLoader {

static final List<String> DEFAULT_PROTOCOLS = Arrays.asList("TLSv1.2", "TLSv1.1");
static final List<String> DEFAULT_PROTOCOLS = Collections.unmodifiableList(
ORDERED_PROTOCOL_ALGORITHM_MAP.containsKey("TLSv1.3") ?
Arrays.asList("TLSv1.3", "TLSv1.2", "TLSv1.1") : Arrays.asList("TLSv1.2", "TLSv1.1"));
static final List<String> DEFAULT_CIPHERS = loadDefaultCiphers();
private static final char[] EMPTY_PASSWORD = new char[0];

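Not part of the diff: a self-contained sketch of the probe used above, handy for checking which default protocol list (`DEFAULT_PROTOCOLS`) a particular JVM ends up with — TLSv1.3 is only included when `SSLContext.getInstance("TLSv1.3")` succeeds:

[source,java]
--------------------------------------------------
import javax.net.ssl.SSLContext;
import java.security.NoSuchAlgorithmException;
import java.util.Arrays;
import java.util.List;

public class DefaultTlsProtocolsCheck {
    public static void main(String[] args) {
        List<String> defaults;
        try {
            // Same probe as SslConfiguration: creating a TLSv1.3 context proves JVM support.
            SSLContext.getInstance("TLSv1.3");
            defaults = Arrays.asList("TLSv1.3", "TLSv1.2", "TLSv1.1");
        } catch (NoSuchAlgorithmException e) {
            // Older JVMs without TLSv1.3 fall back to the previous default.
            defaults = Arrays.asList("TLSv1.2", "TLSv1.1");
        }
        System.out.println("ssl.supported_protocols default on this JVM: " + defaults);
    }
}
--------------------------------------------------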
@@ -460,9 +460,13 @@ protected QueryBuilder doRewrite(QueryRewriteContext queryShardContext) throws I
@Override
protected void extractInnerHitBuilders(Map<String, InnerHitContextBuilder> innerHits) {
if (innerHitBuilder != null) {
String name = innerHitBuilder.getName() != null ? innerHitBuilder.getName() : type;
if (innerHits.containsKey(name)) {
throw new IllegalArgumentException("[inner_hits] already contains an entry for key [" + name + "]");
}

Map<String, InnerHitContextBuilder> children = new HashMap<>();
InnerHitContextBuilder.extractInnerHits(query, children);
String name = innerHitBuilder.getName() != null ? innerHitBuilder.getName() : type;
InnerHitContextBuilder innerHitContextBuilder =
new ParentChildInnerHitContextBuilder(type, true, query, innerHitBuilder, children);
innerHits.put(name, innerHitContextBuilder);
@@ -285,9 +285,13 @@ protected QueryBuilder doRewrite(QueryRewriteContext queryShardContext) throws I
@Override
protected void extractInnerHitBuilders(Map<String, InnerHitContextBuilder> innerHits) {
if (innerHitBuilder != null) {
String name = innerHitBuilder.getName() != null ? innerHitBuilder.getName() : type;
if (innerHits.containsKey(name)) {
throw new IllegalArgumentException("[inner_hits] already contains an entry for key [" + name + "]");
}

Map<String, InnerHitContextBuilder> children = new HashMap<>();
InnerHitContextBuilder.extractInnerHits(query, children);
String name = innerHitBuilder.getName() != null ? innerHitBuilder.getName() : type;
InnerHitContextBuilder innerHitContextBuilder =
new ParentChildInnerHitContextBuilder(type, false, query, innerHitBuilder, children);
innerHits.put(name, innerHitContextBuilder);
@@ -367,4 +367,12 @@ public void testIgnoreUnmappedWithRewrite() throws IOException {
assertThat(query, notNullValue());
assertThat(query, instanceOf(MatchNoDocsQuery.class));
}

public void testExtractInnerHitBuildersWithDuplicate() {
final HasChildQueryBuilder queryBuilder
= new HasChildQueryBuilder(CHILD_DOC, new WrapperQueryBuilder(new MatchAllQueryBuilder().toString()), ScoreMode.None);
queryBuilder.innerHit(new InnerHitBuilder("some_name"));
IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
() -> InnerHitContextBuilder.extractInnerHits(queryBuilder, Collections.singletonMap("some_name", null)));
}
}
@@ -268,4 +268,12 @@ public void testIgnoreUnmappedWithRewrite() throws IOException {
assertThat(query, notNullValue());
assertThat(query, instanceOf(MatchNoDocsQuery.class));
}

public void testExtractInnerHitBuildersWithDuplicate() {
final HasParentQueryBuilder queryBuilder
= new HasParentQueryBuilder(CHILD_DOC, new WrapperQueryBuilder(new MatchAllQueryBuilder().toString()), false);
queryBuilder.innerHit(new InnerHitBuilder("some_name"));
IllegalArgumentException e = expectThrows(IllegalArgumentException.class,
() -> InnerHitContextBuilder.extractInnerHits(queryBuilder, Collections.singletonMap("some_name", null)));
}
}
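
For illustration (not part of the commit), a sketch of the situation the new duplicate check rejects, here using `nested` queries whose `inner_hits` share a name; the index fields (`comments`, `comments.replies`) are made up:

[source,java]
--------------------------------------------------
import org.apache.lucene.search.join.ScoreMode;
import org.elasticsearch.index.query.BoolQueryBuilder;
import org.elasticsearch.index.query.InnerHitBuilder;
import org.elasticsearch.index.query.QueryBuilders;

public class DuplicateInnerHitsSketch {
    public static void main(String[] args) {
        // Two nested queries, both naming their inner_hits "comments".
        BoolQueryBuilder query = QueryBuilders.boolQuery()
            .must(QueryBuilders.nestedQuery("comments", QueryBuilders.matchAllQuery(), ScoreMode.None)
                .innerHit(new InnerHitBuilder("comments")))
            .must(QueryBuilders.nestedQuery("comments.replies", QueryBuilders.matchAllQuery(), ScoreMode.None)
                .innerHit(new InnerHitBuilder("comments")));
        // When the inner hit builders are extracted for the search context, this now fails with:
        // IllegalArgumentException: [inner_hits] already contains an entry for key [comments]
        System.out.println(query);
    }
}
--------------------------------------------------

Before this change the second entry silently replaced the first in the inner-hits map; failing fast makes the naming conflict visible to the caller.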
@@ -143,6 +143,7 @@ public void testClientSucceedsWithCertificateAuthorities() throws IOException {
}
}

@AwaitsFix(bugUrl = "https://github.com/elastic/elasticsearch/issues/38212")
public void testClientSucceedsWithVerificationDisabled() throws IOException {
assertFalse("Cannot disable verification in FIPS JVM", inFipsJvm());
final List<Thread> threads = new ArrayList<>();
@@ -55,8 +55,8 @@
"PUT mapping with _doc on an index that has types":

- skip:
version: " - 6.6.99"
reason: include_type_name is only supported as of 6.7
version: "all"
reason: include_type_name is only supported as of 6.7 # AwaitsFix: https://github.com/elastic/elasticsearch/issues/38202


- do:
@@ -205,17 +205,14 @@ public SearchRequest(StreamInput in) throws IOException {
localClusterAlias = in.readOptionalString();
if (localClusterAlias != null) {
absoluteStartMillis = in.readVLong();
finalReduce = in.readBoolean();
} else {
absoluteStartMillis = DEFAULT_ABSOLUTE_START_MILLIS;
finalReduce = true;
}
} else {
localClusterAlias = null;
absoluteStartMillis = DEFAULT_ABSOLUTE_START_MILLIS;
}
//TODO move to the 6_7_0 branch once backported to 6.x
if (in.getVersion().onOrAfter(Version.V_7_0_0)) {
finalReduce = in.readBoolean();
} else {
finalReduce = true;
}
if (in.getVersion().onOrAfter(Version.V_7_0_0)) {
@@ -245,12 +242,9 @@ public void writeTo(StreamOutput out) throws IOException {
out.writeOptionalString(localClusterAlias);
if (localClusterAlias != null) {
out.writeVLong(absoluteStartMillis);
out.writeBoolean(finalReduce);
}
}
//TODO move to the 6_7_0 branch once backported to 6.x
if (out.getVersion().onOrAfter(Version.V_7_0_0)) {
out.writeBoolean(finalReduce);
}
if (out.getVersion().onOrAfter(Version.V_7_0_0)) {
out.writeBoolean(ccsMinimizeRoundtrips);
}
@@ -1131,6 +1131,21 @@ public Iterable<DiscoveryNode> getFoundPeers() {
return peerFinder.getFoundPeers();
}

/**
* If there is any current committed publication, this method cancels it.
* This method is used exclusively by tests.
* @return true if publication was cancelled, false if there is no current committed publication.
*/
boolean cancelCommittedPublication() {
synchronized (mutex) {
if (currentPublication.isPresent() && currentPublication.get().isCommitted()) {
currentPublication.get().cancel("cancelCommittedPublication");
return true;
}
return false;
}
}

class CoordinatorPublication extends Publication {

private final PublishRequest publishRequest;
@@ -147,16 +147,7 @@ public boolean match(DiscoveryNode node) {
}
} else if ("_host".equals(attr)) {
for (String value : values) {
if (Regex.simpleMatch(value, node.getHostName())) {
if (opType == OpType.OR) {
return true;
}
} else {
if (opType == OpType.AND) {
return false;
}
}
if (Regex.simpleMatch(value, node.getHostAddress())) {
if (Regex.simpleMatch(value, node.getHostName()) || Regex.simpleMatch(value, node.getHostAddress())) {
if (opType == OpType.OR) {
return true;
}
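Not part of the diff: a tiny sketch of the allocation filter this fix affects; after the change above, a `_host` value counts as a match if it equals (or wildcard-matches) either the node's host name or its host address. The setting value here is just an example:

[source,java]
--------------------------------------------------
import org.elasticsearch.common.settings.Settings;

public class HostFilterSketch {
    public static void main(String[] args) {
        // Require shards to be allocated only to nodes whose host name or host address matches the pattern.
        Settings settings = Settings.builder()
            .put("cluster.routing.allocation.require._host", "192.168.1.*")
            .build();
        System.out.println(settings.get("cluster.routing.allocation.require._host"));
    }
}
--------------------------------------------------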
@@ -317,10 +317,14 @@ protected QueryBuilder doRewrite(QueryRewriteContext queryRewriteContext) throws
@Override
public void extractInnerHitBuilders(Map<String, InnerHitContextBuilder> innerHits) {
if (innerHitBuilder != null) {
String name = innerHitBuilder.getName() != null ? innerHitBuilder.getName() : path;
if (innerHits.containsKey(name)) {
throw new IllegalArgumentException("[inner_hits] already contains an entry for key [" + name + "]");
}

Map<String, InnerHitContextBuilder> children = new HashMap<>();
InnerHitContextBuilder.extractInnerHits(query, children);
InnerHitContextBuilder innerHitContextBuilder = new NestedInnerHitContextBuilder(path, query, innerHitBuilder, children);
String name = innerHitBuilder.getName() != null ? innerHitBuilder.getName() : path;
innerHits.put(name, innerHitContextBuilder);
}
}