Merge branch 'ccr' into engine-factory-provider
* ccr: (42 commits)
  [DOCS] Added info about snapshotting your data before an upgrade.
  Add documentation about disabling `_field_names`. (elastic#26813)
  Remove UnsortedNumericDoubleValues (elastic#26817)
  Fix IndexOutOfBoundsException in histograms for NaN doubles (elastic#26787) (elastic#26856)
  [TEST] Added skipping the `headers` feature to the Bulk REST YAML test
  Update type-field.asciidoc
  Fix search_after with geo distance sorting (elastic#26891)
  Use proper logging placeholder for Netty logging
  Add Netty channel information on write and flush failure
  Remove deploying in JBoss documentation
  Document JVM option MaxFDLimit for macOS ()
  Add additional low-level logging handler ()
  Unwrap causes when maybe dying
  Change log level on write and flush failure to warn
  [TEST] add test to ensure legacy list syntax in yml works fine
  Bump BWC version for settings serialization to 6.1.0
  Removed void token filter entries and added two tests
  Added Bengali Analyzer to Elasticsearch with respect to the lucene update(PR#238)
  Fix toString() in SnapshotStatus (elastic#26852)
  elastic#26870 change bwc version for fuzzy_transpositions to 6.1 after backport
  ...
jasontedor committed Oct 6, 2017
2 parents 0ba19c6 + b57cb83 commit 5506cfd
Showing 195 changed files with 2,310 additions and 1,131 deletions.
1 change: 1 addition & 0 deletions TESTING.asciidoc
@@ -352,6 +352,7 @@ These are the linux flavors the Vagrantfile currently supports:
* centos-6
* centos-7
* fedora-25
* fedora-26
* oel-6 aka Oracle Enterprise Linux 6
* oel-7 aka Oracle Enterprise Linux 7
* sles-12
4 changes: 4 additions & 0 deletions Vagrantfile
@@ -64,6 +64,10 @@ Vagrant.configure(2) do |config|
config.vm.box = "elastic/fedora-25-x86_64"
dnf_common config
end
config.vm.define "fedora-26" do |config|
config.vm.box = "elastic/fedora-26-x86_64"
dnf_common config
end
config.vm.define "opensuse-42" do |config|
config.vm.box = "elastic/opensuse-42-x86_64"
opensuse_common config
2 changes: 1 addition & 1 deletion buildSrc/build.gradle
@@ -92,7 +92,7 @@ dependencies {
compile 'com.netflix.nebula:gradle-info-plugin:3.0.3'
compile 'org.eclipse.jgit:org.eclipse.jgit:3.2.0.201312181205-r'
compile 'com.perforce:p4java:2012.3.551082' // THIS IS SUPPOSED TO BE OPTIONAL IN THE FUTURE....
compile 'de.thetaphi:forbiddenapis:2.3'
compile 'de.thetaphi:forbiddenapis:2.4.1'
compile 'org.apache.rat:apache-rat:0.11'
compile "org.elasticsearch:jna:4.4.0-1"
}
@@ -20,6 +20,7 @@ class VagrantTestPlugin implements Plugin<Project> {
'debian-8',
'debian-9',
'fedora-25',
'fedora-26',
'oel-6',
'oel-7',
'opensuse-42',
2 changes: 1 addition & 1 deletion buildSrc/version.properties
@@ -1,6 +1,6 @@
# When updating elasticsearch, please update 'rest' version in core/src/main/resources/org/elasticsearch/bootstrap/test-framework.policy
elasticsearch = 7.0.0-alpha1
lucene = 7.0.0
lucene = 7.1.0-snapshot-f33ed4ba12a

# optional dependencies
spatial4j = 0.6
5 changes: 3 additions & 2 deletions core/licenses/lucene-NOTICE.txt
@@ -54,13 +54,14 @@ The KStem stemmer in
was developed by Bob Krovetz and Sergio Guzman-Lara (CIIR-UMass Amherst)
under the BSD-license.

The Arabic,Persian,Romanian,Bulgarian, and Hindi analyzers (common) come with a default
The Arabic,Persian,Romanian,Bulgarian, Hindi and Bengali analyzers (common) come with a default
stopword list that is BSD-licensed created by Jacques Savoy. These files reside in:
analysis/common/src/resources/org/apache/lucene/analysis/ar/stopwords.txt,
analysis/common/src/resources/org/apache/lucene/analysis/fa/stopwords.txt,
analysis/common/src/resources/org/apache/lucene/analysis/ro/stopwords.txt,
analysis/common/src/resources/org/apache/lucene/analysis/bg/stopwords.txt,
analysis/common/src/resources/org/apache/lucene/analysis/hi/stopwords.txt
analysis/common/src/resources/org/apache/lucene/analysis/hi/stopwords.txt,
analysis/common/src/resources/org/apache/lucene/analysis/bn/stopwords.txt
See http://members.unine.ch/jacques.savoy/clef/index.html.

The German,Spanish,Finnish,French,Hungarian,Italian,Portuguese,Russian and Swedish light stemmers
1 change: 0 additions & 1 deletion core/licenses/lucene-analyzers-common-7.0.0.jar.sha1

This file was deleted.

@@ -0,0 +1 @@
a59ac3bdd17becc848f319fb77994060661c2c71
1 change: 0 additions & 1 deletion core/licenses/lucene-backward-codecs-7.0.0.jar.sha1

This file was deleted.

@@ -0,0 +1 @@
47f560086db8683b5be26911fae3721d8b0da465
1 change: 0 additions & 1 deletion core/licenses/lucene-core-7.0.0.jar.sha1

This file was deleted.

@@ -0,0 +1 @@
17bd8e886ac2e763c27a507e697f78e43103afd3
1 change: 0 additions & 1 deletion core/licenses/lucene-grouping-7.0.0.jar.sha1

This file was deleted.

@@ -0,0 +1 @@
bb7d5f5f6dd0bada3991828b8687a35c90de76ca
1 change: 0 additions & 1 deletion core/licenses/lucene-highlighter-7.0.0.jar.sha1

This file was deleted.

@@ -0,0 +1 @@
f024368b33bfb7c1589aaf424992e474c4e3be38
1 change: 0 additions & 1 deletion core/licenses/lucene-join-7.0.0.jar.sha1

This file was deleted.

@@ -0,0 +1 @@
7b525cb2e2c8403543fefc09b972c78b86d2f0da
1 change: 0 additions & 1 deletion core/licenses/lucene-memory-7.0.0.jar.sha1

This file was deleted.

@@ -0,0 +1 @@
61cc3ced15fa80d8f97affe0c8df9818eeb8af49
1 change: 0 additions & 1 deletion core/licenses/lucene-misc-7.0.0.jar.sha1

This file was deleted.

@@ -0,0 +1 @@
03a71b5875d25576c9f8992822db65fb181f4328
1 change: 0 additions & 1 deletion core/licenses/lucene-queries-7.0.0.jar.sha1

This file was deleted.

@@ -0,0 +1 @@
9c07c15b2c6f8bd3d75e0f53fff5631f012bff98
1 change: 0 additions & 1 deletion core/licenses/lucene-queryparser-7.0.0.jar.sha1

This file was deleted.

@@ -0,0 +1 @@
e0a7815981d096d96e7dc41b1c063cd78c91132d
1 change: 0 additions & 1 deletion core/licenses/lucene-sandbox-7.0.0.jar.sha1

This file was deleted.

@@ -0,0 +1 @@
1ea14867a6bc545fb2e09dd1f31b48523cdbc040
1 change: 0 additions & 1 deletion core/licenses/lucene-spatial-7.0.0.jar.sha1

This file was deleted.

@@ -0,0 +1 @@
58ce824ebc6126e37ff232c96a561a659377a873
1 change: 0 additions & 1 deletion core/licenses/lucene-spatial-extras-7.0.0.jar.sha1

This file was deleted.

@@ -0,0 +1 @@
3fcd89a8cda5ee2049c189b06b5e30258b1aa198
1 change: 0 additions & 1 deletion core/licenses/lucene-spatial3d-7.0.0.jar.sha1

This file was deleted.

@@ -0,0 +1 @@
1d1ada8fbb1b2bbbc88e9f29e28802a7b44a6665
1 change: 0 additions & 1 deletion core/licenses/lucene-suggest-7.0.0.jar.sha1

This file was deleted.

@@ -0,0 +1 @@
fb7f18e6a81899e3ac95760b56bea21ebf143cf9
4 changes: 2 additions & 2 deletions core/src/main/java/org/elasticsearch/Version.java
@@ -118,10 +118,10 @@ public class Version implements Comparable<Version> {
new Version(V_6_0_0_rc2_ID, org.apache.lucene.util.Version.LUCENE_7_0_0);
public static final int V_6_1_0_ID = 6010099;
public static final Version V_6_1_0 =
new Version(V_6_1_0_ID, org.apache.lucene.util.Version.LUCENE_7_0_0);
new Version(V_6_1_0_ID, org.apache.lucene.util.Version.LUCENE_7_1_0);
public static final int V_7_0_0_alpha1_ID = 7000001;
public static final Version V_7_0_0_alpha1 =
new Version(V_7_0_0_alpha1_ID, org.apache.lucene.util.Version.LUCENE_7_0_0);
new Version(V_7_0_0_alpha1_ID, org.apache.lucene.util.Version.LUCENE_7_1_0);
public static final Version CURRENT = V_7_0_0_alpha1;

// unreleased versions must be added to the above list with the suffix _UNRELEASED (with the exception of CURRENT)
@@ -20,13 +20,12 @@
package org.elasticsearch.action.admin.cluster.snapshots.status;

import org.elasticsearch.cluster.SnapshotsInProgress.State;
import org.elasticsearch.common.Strings;
import org.elasticsearch.common.io.stream.StreamInput;
import org.elasticsearch.common.io.stream.StreamOutput;
import org.elasticsearch.common.io.stream.Streamable;
import org.elasticsearch.common.xcontent.ToXContent.Params;
import org.elasticsearch.common.xcontent.ToXContentObject;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.snapshots.Snapshot;

import java.io.IOException;
@@ -160,15 +159,7 @@ public static SnapshotStatus readSnapshotStatus(StreamInput in) throws IOExcepti

@Override
public String toString() {
try {
XContentBuilder builder = XContentFactory.jsonBuilder().prettyPrint();
builder.startObject();
toXContent(builder, EMPTY_PARAMS);
builder.endObject();
return builder.string();
} catch (IOException e) {
return "{ \"error\" : \"" + e.getMessage() + "\"}";
}
return Strings.toString(this, true, false);
}

/**
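
For context on the toString() change above: the new return statement delegates to org.elasticsearch.common.Strings, and judging by the removed block the two boolean arguments are pretty-print and human-readable flags. A sketch of the inline logic it replaces (essentially the deleted code, wrapped in a helper for clarity):

    import java.io.IOException;
    import org.elasticsearch.common.xcontent.ToXContent;
    import org.elasticsearch.common.xcontent.XContentBuilder;
    import org.elasticsearch.common.xcontent.XContentFactory;

    public class PrettyJsonSketch {
        // Roughly what the removed block did inline: pretty-print the object's XContent as JSON,
        // falling back to a small error document if serialization fails.
        static String prettyJson(ToXContent status) {
            try {
                XContentBuilder builder = XContentFactory.jsonBuilder().prettyPrint();
                builder.startObject();
                status.toXContent(builder, ToXContent.EMPTY_PARAMS);
                builder.endObject();
                return builder.string();
            } catch (IOException e) {
                return "{ \"error\" : \"" + e.getMessage() + "\"}";
            }
        }
    }
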
@@ -83,14 +83,8 @@ protected void masterOperation(GetSettingsRequest request, ClusterState state, A
if (request.humanReadable()) {
settings = IndexMetaData.addHumanReadableSettings(settings);
}
if (!CollectionUtils.isEmpty(request.names())) {
Settings.Builder settingsBuilder = Settings.builder();
for (Map.Entry<String, String> entry : settings.getAsMap().entrySet()) {
if (Regex.simpleMatch(request.names(), entry.getKey())) {
settingsBuilder.put(entry.getKey(), entry.getValue());
}
}
settings = settingsBuilder.build();
if (CollectionUtils.isEmpty(request.names()) == false) {
settings = settings.filter(k -> Regex.simpleMatch(request.names(), k));
}
indexToSettingsBuilder.put(concreteIndex.getName(), settings);
}
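
A side note on the hunk above: the hand-rolled copy loop is replaced by Settings#filter, which takes a key predicate and returns a filtered Settings view. A minimal, self-contained sketch of that call style (the index settings and name patterns below are invented for illustration):

    import org.elasticsearch.common.regex.Regex;
    import org.elasticsearch.common.settings.Settings;

    public class SettingsFilterSketch {
        public static void main(String[] args) {
            // Invented settings, standing in for an index's settings.
            Settings settings = Settings.builder()
                    .put("index.number_of_shards", "5")
                    .put("index.number_of_replicas", "1")
                    .put("index.refresh_interval", "1s")
                    .build();
            String[] requestedNames = { "index.number_of_*" };

            // Keep only the keys that match the requested name patterns, as the new code does.
            Settings filtered = settings.filter(key -> Regex.simpleMatch(requestedNames, key));
            System.out.println(filtered.keySet()); // the two index.number_of_* keys
        }
    }
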
@@ -40,6 +40,7 @@
import org.elasticsearch.search.fetch.ShardFetchRequest;
import org.elasticsearch.search.fetch.ShardFetchSearchRequest;
import org.elasticsearch.search.internal.InternalScrollSearchRequest;
import org.elasticsearch.search.internal.ShardSearchRequest;
import org.elasticsearch.search.internal.ShardSearchTransportRequest;
import org.elasticsearch.search.query.QuerySearchRequest;
import org.elasticsearch.search.query.QuerySearchResult;
@@ -320,7 +321,8 @@ public void messageReceived(ScrollFreeContextRequest request, TransportChannel c
channel.sendResponse(new SearchFreeContextResponse(freed));
}
});
TransportActionProxy.registerProxyAction(transportService, FREE_CONTEXT_SCROLL_ACTION_NAME, SearchFreeContextResponse::new);
TransportActionProxy.registerProxyAction(transportService, FREE_CONTEXT_SCROLL_ACTION_NAME,
(Supplier<TransportResponse>) SearchFreeContextResponse::new);
transportService.registerRequestHandler(FREE_CONTEXT_ACTION_NAME, ThreadPool.Names.SAME, SearchFreeContextRequest::new,
new TaskAwareTransportRequestHandler<SearchFreeContextRequest>() {
@Override
@@ -329,7 +331,8 @@ public void messageReceived(SearchFreeContextRequest request, TransportChannel c
channel.sendResponse(new SearchFreeContextResponse(freed));
}
});
TransportActionProxy.registerProxyAction(transportService, FREE_CONTEXT_ACTION_NAME, SearchFreeContextResponse::new);
TransportActionProxy.registerProxyAction(transportService, FREE_CONTEXT_ACTION_NAME,
(Supplier<TransportResponse>) SearchFreeContextResponse::new);
transportService.registerRequestHandler(CLEAR_SCROLL_CONTEXTS_ACTION_NAME, () -> TransportRequest.Empty.INSTANCE,
ThreadPool.Names.SAME, new TaskAwareTransportRequestHandler<TransportRequest.Empty>() {
@Override
@@ -339,7 +342,7 @@ public void messageReceived(TransportRequest.Empty request, TransportChannel cha
}
});
TransportActionProxy.registerProxyAction(transportService, CLEAR_SCROLL_CONTEXTS_ACTION_NAME,
() -> TransportResponse.Empty.INSTANCE);
() -> TransportResponse.Empty.INSTANCE);

transportService.registerRequestHandler(DFS_ACTION_NAME, ThreadPool.Names.SAME, ShardSearchTransportRequest::new,
new TaskAwareTransportRequestHandler<ShardSearchTransportRequest>() {
@@ -394,7 +397,8 @@ public void onFailure(Exception e) {
});
}
});
TransportActionProxy.registerProxyAction(transportService, QUERY_ACTION_NAME, QuerySearchResult::new);
TransportActionProxy.registerProxyAction(transportService, QUERY_ACTION_NAME,
(request) -> ((ShardSearchRequest)request).numberOfShards() == 1 ? QueryFetchSearchResult::new : QuerySearchResult::new);

transportService.registerRequestHandler(QUERY_ID_ACTION_NAME, ThreadPool.Names.SEARCH, QuerySearchRequest::new,
new TaskAwareTransportRequestHandler<QuerySearchRequest>() {
@@ -455,7 +459,8 @@ public void messageReceived(ShardSearchTransportRequest request, TransportChanne
channel.sendResponse(new CanMatchResponse(canMatch));
}
});
TransportActionProxy.registerProxyAction(transportService, QUERY_CAN_MATCH_NAME, CanMatchResponse::new);
TransportActionProxy.registerProxyAction(transportService, QUERY_CAN_MATCH_NAME,
(Supplier<TransportResponse>) CanMatchResponse::new);
}

public static final class CanMatchResponse extends SearchPhaseResult {
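
About the (Supplier<TransportResponse>) casts in this file: they suggest registerProxyAction now has a second overload that derives the response reader from the request (as the QUERY_ACTION_NAME registration does), which makes a bare constructor reference ambiguous when the response class has more than one constructor. A toy, self-contained illustration of that kind of ambiguity — invented types, not the actual Elasticsearch signatures:

    import java.util.function.Function;
    import java.util.function.Supplier;

    public class ProxyOverloadSketch {
        // Toy response type with two constructors, so Response::new is an inexact method reference.
        static class Response {
            Response() {}
            Response(String source) {}
        }

        // Overload that always uses the same response reader.
        static void register(String action, Supplier<Response> responseReader) {
            System.out.println(action + " -> fixed response type");
        }

        // Overload that picks the response reader per request.
        static void register(String action, Function<String, Supplier<Response>> readerPerRequest) {
            System.out.println(action + " -> response type chosen per request");
        }

        public static void main(String[] args) {
            // register("free_context", Response::new);                    // ambiguous: both overloads could apply
            register("free_context", (Supplier<Response>) Response::new);  // the cast picks the Supplier overload
            register("query", request -> Response::new);                   // one-arg lambda matches only the Function overload
        }
    }
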
@@ -1078,9 +1078,7 @@ public static void toXContent(IndexMetaData indexMetaData, XContentBuilder build
boolean binary = params.paramAsBoolean("binary", false);

builder.startObject(KEY_SETTINGS);
for (Map.Entry<String, String> entry : indexMetaData.getSettings().getAsMap().entrySet()) {
builder.field(entry.getKey(), entry.getValue());
}
indexMetaData.getSettings().toXContent(builder, new MapParams(Collections.singletonMap("flat_settings", "true")));
builder.endObject();

builder.startArray(KEY_MAPPINGS);
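
Here and in the MetaData hunk below, the manual key/value loop is replaced by Settings#toXContent with a flat_settings parameter, which, as I understand it, emits dotted keys instead of nested objects. A small sketch under those assumptions (the settings values are invented):

    import java.util.Collections;
    import org.elasticsearch.common.settings.Settings;
    import org.elasticsearch.common.xcontent.ToXContent;
    import org.elasticsearch.common.xcontent.XContentBuilder;
    import org.elasticsearch.common.xcontent.XContentFactory;

    public class FlatSettingsSketch {
        public static void main(String[] args) throws Exception {
            Settings settings = Settings.builder()
                    .put("index.number_of_shards", "5")
                    .put("index.refresh_interval", "1s")
                    .build();
            XContentBuilder builder = XContentFactory.jsonBuilder();
            builder.startObject();
            // flat_settings=true should keep keys like "index.number_of_shards" flat
            // rather than nesting {"index": {"number_of_shards": ...}}.
            settings.toXContent(builder, new ToXContent.MapParams(Collections.singletonMap("flat_settings", "true")));
            builder.endObject();
            System.out.println(builder.string());
        }
    }
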
@@ -1000,17 +1000,13 @@ public static void toXContent(MetaData metaData, XContentBuilder builder, ToXCon

if (!metaData.persistentSettings().isEmpty()) {
builder.startObject("settings");
for (Map.Entry<String, String> entry : metaData.persistentSettings().getAsMap().entrySet()) {
builder.field(entry.getKey(), entry.getValue());
}
metaData.persistentSettings().toXContent(builder, new MapParams(Collections.singletonMap("flat_settings", "true")));
builder.endObject();
}

if (context == XContentContext.API && !metaData.transientSettings().isEmpty()) {
builder.startObject("transient_settings");
for (Map.Entry<String, String> entry : metaData.transientSettings().getAsMap().entrySet()) {
builder.field(entry.getKey(), entry.getValue());
}
metaData.transientSettings().toXContent(builder, new MapParams(Collections.singletonMap("flat_settings", "true")));
builder.endObject();
}

@@ -165,7 +165,7 @@ public void updateSettings(final UpdateSettingsClusterStateUpdateRequest request

indexScopedSettings.validate(normalizedSettings);
// never allow to change the number of shards
for (String key : normalizedSettings.getKeys()) {
for (String key : normalizedSettings.keySet()) {
Setting setting = indexScopedSettings.get(key);
assert setting != null; // we already validated the normalized settings
settingsForClosedIndices.copy(key, normalizedSettings);
@@ -211,8 +211,7 @@ public ClusterState execute(ClusterState currentState) {

if (!skippedSettings.isEmpty() && !openIndices.isEmpty()) {
throw new IllegalArgumentException(String.format(Locale.ROOT,
"Can't update non dynamic settings [%s] for open indices %s", skippedSettings, openIndices
));
"Can't update non dynamic settings [%s] for open indices %s", skippedSettings, openIndices));
}

int updatedNumberOfReplicas = openSettings.getAsInt(IndexMetaData.SETTING_NUMBER_OF_REPLICAS, -1);
@@ -30,7 +30,6 @@
import java.util.HashMap;
import java.util.Map;
import java.util.function.BiConsumer;
import java.util.function.Consumer;

public class DiscoveryNodeFilters {

@@ -56,10 +55,6 @@ public enum OpType {
}
};

public static DiscoveryNodeFilters buildFromSettings(OpType opType, String prefix, Settings settings) {
return buildFromKeyValue(opType, settings.getByPrefix(prefix).getAsMap());
}

public static DiscoveryNodeFilters buildFromKeyValue(OpType opType, Map<String, String> filters) {
Map<String, String[]> bFilters = new HashMap<>();
for (Map.Entry<String, String> entry : filters.entrySet()) {
@@ -37,7 +37,7 @@ private ESLoggerFactory() {

public static final Setting<Level> LOG_DEFAULT_LEVEL_SETTING =
new Setting<>("logger.level", Level.INFO.name(), Level::valueOf, Property.NodeScope);
public static final Setting<Level> LOG_LEVEL_SETTING =
public static final Setting.AffixSetting<Level> LOG_LEVEL_SETTING =
Setting.prefixKeySetting("logger.", (key) -> new Setting<>(key, Level.INFO.name(), Level::valueOf, Property.Dynamic,
Property.NodeScope));

@@ -52,7 +52,6 @@
import java.util.ArrayList;
import java.util.EnumSet;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.Set;
import java.util.concurrent.atomic.AtomicBoolean;
@@ -182,15 +181,12 @@ private static void configureLoggerLevels(final Settings settings) {
final Level level = ESLoggerFactory.LOG_DEFAULT_LEVEL_SETTING.get(settings);
Loggers.setLevel(ESLoggerFactory.getRootLogger(), level);
}

final Map<String, String> levels = settings.filter(ESLoggerFactory.LOG_LEVEL_SETTING::match).getAsMap();
for (final String key : levels.keySet()) {
ESLoggerFactory.LOG_LEVEL_SETTING.getAllConcreteSettings(settings)
// do not set a log level for a logger named level (from the default log setting)
if (!key.equals(ESLoggerFactory.LOG_DEFAULT_LEVEL_SETTING.getKey())) {
final Level level = ESLoggerFactory.LOG_LEVEL_SETTING.getConcreteSetting(key).get(settings);
Loggers.setLevel(ESLoggerFactory.getLogger(key.substring("logger.".length())), level);
}
}
.filter(s -> s.getKey().equals(ESLoggerFactory.LOG_DEFAULT_LEVEL_SETTING.getKey()) == false).forEach(s -> {
final Level level = s.get(settings);
Loggers.setLevel(ESLoggerFactory.getLogger(s.getKey().substring("logger.".length())), level);
});
}

/**
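
For context on the logging change above: LOG_LEVEL_SETTING is now an affix setting, so the concrete per-logger settings are streamed via getAllConcreteSettings instead of being read from a flattened map. A minimal sketch of that pattern (the "logger.org.elasticsearch.discovery" key is an invented example):

    import org.apache.logging.log4j.Level;
    import org.elasticsearch.common.logging.ESLoggerFactory;
    import org.elasticsearch.common.logging.Loggers;
    import org.elasticsearch.common.settings.Settings;

    public class LoggerLevelsSketch {
        public static void main(String[] args) {
            // Invented settings: one per-logger level plus the default "logger.level" key that must be skipped.
            Settings settings = Settings.builder()
                    .put("logger.level", "INFO")
                    .put("logger.org.elasticsearch.discovery", "DEBUG")
                    .build();

            ESLoggerFactory.LOG_LEVEL_SETTING.getAllConcreteSettings(settings)
                    // the default-level key also matches the "logger." prefix, so filter it out
                    .filter(s -> s.getKey().equals(ESLoggerFactory.LOG_DEFAULT_LEVEL_SETTING.getKey()) == false)
                    .forEach(s -> {
                        Level level = s.get(settings);
                        Loggers.setLevel(ESLoggerFactory.getLogger(s.getKey().substring("logger.".length())), level);
                    });
        }
    }
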