
Commit

Merge branch 'master' into feature/109732-synthetics-integration-new-advanced-options
  • Loading branch information
2 parents 44f2f94 + 804da09 commit 3b437ba
Showing 170 changed files with 3,798 additions and 1,015 deletions.
2 changes: 1 addition & 1 deletion docs/settings/security-settings.asciidoc
@@ -218,7 +218,7 @@ There is a very limited set of cases when you'd want to change these settings. F
| Determines if HTTP authentication schemes used by the enabled authentication providers should be automatically supported during HTTP authentication. By default, this setting is set to `true`.

| `xpack.security.authc.http.schemes[]`
| List of HTTP authentication schemes that {kib} HTTP authentication should support. By default, this setting is set to `['apikey']` to support HTTP authentication with <<api-keys, `ApiKey`>> scheme.
| List of HTTP authentication schemes that {kib} HTTP authentication should support. By default, this setting is set to `['apikey', 'bearer']` to support HTTP authentication with the <<api-keys, `ApiKey`>> and <<http-authentication, `Bearer`>> schemes.

|===

6 changes: 3 additions & 3 deletions docs/user/security/authentication/index.asciidoc
@@ -437,14 +437,14 @@ This type of authentication is usually useful for machine-to-machine interaction

By default {kib} supports <<api-keys, `ApiKey`>> authentication scheme _and_ any scheme supported by the currently enabled authentication provider. For example, `Basic` authentication scheme is automatically supported when basic authentication provider is enabled, or `Bearer` scheme when any of the token based authentication providers is enabled (Token, SAML, OpenID Connect, PKI or Kerberos). But it's also possible to add support for any other authentication scheme in the `kibana.yml` configuration file, as follows:

NOTE: Don't forget to explicitly specify default `apikey` scheme when you just want to add a new one to the list.
NOTE: Don't forget to explicitly specify the default `apikey` and `bearer` schemes when you just want to add a new one to the list.

[source,yaml]
--------------------------------------------------------------------------------
xpack.security.authc.http.schemes: [apikey, basic, something-custom]
xpack.security.authc.http.schemes: [apikey, bearer, basic, something-custom]
--------------------------------------------------------------------------------

With this configuration, you can send requests to {kib} with the `Authorization` header using `ApiKey`, `Basic` or `Something-Custom` HTTP schemes (case insensitive). Under the hood, {kib} relays this header to {es}, then {es} authenticates the request using the credentials in the header.
With this configuration, you can send requests to {kib} with the `Authorization` header using `ApiKey`, `Bearer`, `Basic` or `Something-Custom` HTTP schemes (case insensitive). Under the hood, {kib} relays this header to {es}, then {es} authenticates the request using the credentials in the header.
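For readers who want to see what this looks like in practice, here is a minimal sketch of a request using the `ApiKey` scheme. It is illustrative only: the Kibana URL, endpoint path, and key id/secret are placeholders, and it assumes a Node 18+ runtime where `fetch` and `Buffer` are available.

// Hypothetical client call, not part of this commit. The scheme name in the
// Authorization header is matched case-insensitively against the schemes listed
// in xpack.security.authc.http.schemes.
const apiKey = Buffer.from('my-key-id:my-key-secret').toString('base64');

const res = await fetch('https://localhost:5601/api/status', {
  headers: {
    Authorization: `ApiKey ${apiKey}`,
  },
});
console.log(res.status);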

[float]
[[embedded-content-authentication]]
10 changes: 10 additions & 0 deletions renovate.json5
@@ -75,6 +75,16 @@
labels: ['Team:Operations', 'release_note:skip'],
enabled: true,
},
{
groupName: 'polyfills',
packageNames: ['core-js'],
matchPackagePatterns: ["polyfill"],
excludePackageNames: ['@loaders.gl/polyfills'],
reviewers: ['team:kibana-operations'],
matchBaseBranches: ['master'],
labels: ['Team:Operations', 'release_note:skip'],
enabled: true,
},
{
groupName: 'vega related modules',
packageNames: ['vega', 'vega-lite', 'vega-schema-url-parser', 'vega-tooltip'],
1 change: 1 addition & 0 deletions src/plugins/bfetch/common/util/index.ts
@@ -8,3 +8,4 @@

export * from './normalize_error';
export * from './remove_leading_slash';
export * from './query_params';
12 changes: 12 additions & 0 deletions src/plugins/bfetch/common/util/query_params.ts
@@ -0,0 +1,12 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/

export const appendQueryParam = (url: string, key: string, value: string): string => {
const separator = url.includes('?') ? '&' : '?';
return `${url}${separator}${key}=${value}`;
};
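A quick usage sketch of the new helper (the URLs below are illustrative). It picks `?` or `&` depending on whether the URL already carries a query string, and it performs no URL encoding, so callers are expected to pass already-encoded keys and values:

// Hypothetical usage, not part of this commit.
import { appendQueryParam } from './query_params';

appendQueryParam('/internal/bsearch', 'compress', 'true');
// => '/internal/bsearch?compress=true'

appendQueryParam('/internal/bsearch?foo=bar', 'compress', 'true');
// => '/internal/bsearch?foo=bar&compress=true'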
12 changes: 6 additions & 6 deletions src/plugins/bfetch/public/streaming/fetch_streaming.ts
@@ -10,6 +10,7 @@ import { map, share } from 'rxjs/operators';
import { inflateResponse } from '.';
import { fromStreamingXhr } from './from_streaming_xhr';
import { split } from './split';
import { appendQueryParam } from '../../common';

export interface FetchStreamingParams {
url: string;
@@ -34,16 +35,15 @@ export function fetchStreaming({
}: FetchStreamingParams) {
const xhr = new window.XMLHttpRequest();

// Begin the request
xhr.open(method, url);
xhr.withCredentials = true;

const isCompressionDisabled = getIsCompressionDisabled();

if (!isCompressionDisabled) {
headers['X-Chunk-Encoding'] = 'deflate';
url = appendQueryParam(url, 'compress', 'true');
}

// Begin the request
xhr.open(method, url);
xhr.withCredentials = true;

// Set the HTTP headers
Object.entries(headers).forEach(([k, v]) => xhr.setRequestHeader(k, v));
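The reason `xhr.open()` moved below the compression check is that `XMLHttpRequest.open()` captures the URL at the moment it is called, so a query parameter appended to the `url` variable afterwards would never reach the server. A small illustration of that ordering, independent of the plugin code:

// Illustration only, not part of the commit.
import { appendQueryParam } from '../../common';

const xhr = new XMLHttpRequest();
let url = '/internal/bsearch';

// Wrong order: the request is opened against '/internal/bsearch' and the
// later change to `url` is ignored.
//   xhr.open('POST', url);
//   url = appendQueryParam(url, 'compress', 'true');

// Correct order, as in the updated fetchStreaming:
url = appendQueryParam(url, 'compress', 'true');
xhr.open('POST', url);
xhr.withCredentials = true;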

1 change: 0 additions & 1 deletion src/plugins/bfetch/server/index.ts
@@ -10,7 +10,6 @@ import { PluginInitializerContext } from '../../../core/server';
import { BfetchServerPlugin } from './plugin';

export { BfetchServerSetup, BfetchServerStart, BatchProcessingRouteParams } from './plugin';
export { StreamingRequestHandler } from './types';

export function plugin(initializerContext: PluginInitializerContext) {
return new BfetchServerPlugin(initializerContext);
1 change: 0 additions & 1 deletion src/plugins/bfetch/server/mocks.ts
@@ -17,7 +17,6 @@ const createSetupContract = (): Setup => {
const setupContract: Setup = {
addBatchProcessingRoute: jest.fn(),
addStreamingResponseRoute: jest.fn(),
createStreamingRequestHandler: jest.fn(),
};
return setupContract;
};
78 changes: 3 additions & 75 deletions src/plugins/bfetch/server/plugin.ts
@@ -13,9 +13,6 @@ import type {
Plugin,
Logger,
KibanaRequest,
RouteMethod,
RequestHandler,
RequestHandlerContext,
StartServicesAccessor,
} from 'src/core/server';
import { schema } from '@kbn/config-schema';
@@ -28,7 +25,6 @@ import {
removeLeadingSlash,
normalizeError,
} from '../common';
import { StreamingRequestHandler } from './types';
import { createStream } from './streaming';
import { getUiSettings } from './ui_settings';

@@ -52,44 +48,6 @@ export interface BfetchServerSetup {
path: string,
params: (request: KibanaRequest) => StreamingResponseHandler<Payload, Response>
) => void;
/**
* Create a streaming request handler to be able to use an Observable to return chunked content to the client.
* This is meant to be used with the `fetchStreaming` API of the `bfetch` client-side plugin.
*
* @example
* ```ts
* setup({ http }: CoreStart, { bfetch }: SetupDeps) {
* const router = http.createRouter();
* router.post(
* {
* path: '/api/my-plugin/stream-endpoint,
* validate: {
* body: schema.object({
* term: schema.string(),
* }),
* }
* },
* bfetch.createStreamingResponseHandler(async (ctx, req) => {
* const { term } = req.body;
* const results$ = await myApi.getResults$(term);
* return results$;
* })
* )}
*
* ```
*
* @param streamHandler
*/
createStreamingRequestHandler: <
Response,
P,
Q,
B,
Context extends RequestHandlerContext = RequestHandlerContext,
Method extends RouteMethod = any
>(
streamHandler: StreamingRequestHandler<Response, P, Q, B, Method>
) => RequestHandler<P, Q, B, Context, Method>;
}

// eslint-disable-next-line
@@ -124,15 +82,10 @@ export class BfetchServerPlugin
logger,
});
const addBatchProcessingRoute = this.addBatchProcessingRoute(addStreamingResponseRoute);
const createStreamingRequestHandler = this.createStreamingRequestHandler({
getStartServices: core.getStartServices,
logger,
});

return {
addBatchProcessingRoute,
addStreamingResponseRoute,
createStreamingRequestHandler,
};
}

@@ -142,10 +95,6 @@

public stop() {}

private getCompressionDisabled(request: KibanaRequest) {
return request.headers['x-chunk-encoding'] !== 'deflate';
}

private addStreamingResponseRoute =
({
getStartServices,
@@ -162,42 +111,21 @@
path: `/${removeLeadingSlash(path)}`,
validate: {
body: schema.any(),
query: schema.object({ compress: schema.boolean({ defaultValue: false }) }),
},
},
async (context, request, response) => {
const handlerInstance = handler(request);
const data = request.body;
const compressionDisabled = this.getCompressionDisabled(request);
const compress = request.query.compress;
return response.ok({
headers: streamingHeaders,
body: createStream(
handlerInstance.getResponseStream(data),
logger,
compressionDisabled
),
body: createStream(handlerInstance.getResponseStream(data), logger, compress),
});
}
);
};

private createStreamingRequestHandler =
({
logger,
getStartServices,
}: {
logger: Logger;
getStartServices: StartServicesAccessor;
}): BfetchServerSetup['createStreamingRequestHandler'] =>
(streamHandler) =>
async (context, request, response) => {
const response$ = await streamHandler(context, request);
const compressionDisabled = this.getCompressionDisabled(request);
return response.ok({
headers: streamingHeaders,
body: createStream(response$, logger, compressionDisabled),
});
};

private addBatchProcessingRoute =
(
addStreamingResponseRoute: BfetchServerSetup['addStreamingResponseRoute']
8 changes: 4 additions & 4 deletions src/plugins/bfetch/server/streaming/create_stream.ts
@@ -15,9 +15,9 @@ import { createNDJSONStream } from './create_ndjson_stream';
export function createStream<Payload, Response>(
response$: Observable<Response>,
logger: Logger,
compressionDisabled: boolean
compress: boolean
): Stream {
return compressionDisabled
? createNDJSONStream(response$, logger)
: createCompressedStream(response$, logger);
return compress
? createCompressedStream(response$, logger)
: createNDJSONStream(response$, logger);
}
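Note that the flag's polarity flipped along with its rename (`compressionDisabled` to `compress`), so updated callers must invert the boolean they pass. A hedged sketch of the two call shapes; `logger` is assumed to be a Kibana `Logger` supplied by the surrounding plugin, and the observable is illustrative:

// Illustrative call sites, not part of this commit.
import { of } from 'rxjs';
import type { Logger } from 'src/core/server';
import { createStream } from './create_stream';

declare const logger: Logger; // assumed to be provided by the plugin

const response$ = of({ id: 0, result: 'ok' });

// Old signature: createStream(response$, logger, compressionDisabled), where `true` meant plain NDJSON.
// New signature: the third argument is `compress`, so the meaning is inverted.
createStream(response$, logger, false); // plain NDJSON stream
createStream(response$, logger, true); // deflate-compressed stream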
27 changes: 0 additions & 27 deletions src/plugins/bfetch/server/types.ts

This file was deleted.

66 changes: 30 additions & 36 deletions test/api_integration/apis/search/bsearch.ts
@@ -29,28 +29,25 @@ export default function ({ getService }: FtrProviderContext) {
describe('bsearch', () => {
describe('post', () => {
it('should return 200 a single response', async () => {
const resp = await supertest
.post(`/internal/bsearch`)
.set({ 'X-Chunk-Encoding': '' })
.send({
batch: [
{
request: {
params: {
index: '.kibana',
body: {
query: {
match_all: {},
},
const resp = await supertest.post(`/internal/bsearch`).send({
batch: [
{
request: {
params: {
index: '.kibana',
body: {
query: {
match_all: {},
},
},
},
options: {
strategy: 'es',
},
},
],
});
options: {
strategy: 'es',
},
},
],
});

const jsonBody = parseBfetchResponse(resp);

@@ -62,28 +59,25 @@
});

it('should return 200 a single response from compressed', async () => {
const resp = await supertest
.post(`/internal/bsearch`)
.set({ 'X-Chunk-Encoding': 'deflate' })
.send({
batch: [
{
request: {
params: {
index: '.kibana',
body: {
query: {
match_all: {},
},
const resp = await supertest.post(`/internal/bsearch?compress=true`).send({
batch: [
{
request: {
params: {
index: '.kibana',
body: {
query: {
match_all: {},
},
},
},
options: {
strategy: 'es',
},
},
],
});
options: {
strategy: 'es',
},
},
],
});

const jsonBody = parseBfetchResponse(resp, true);

15 changes: 8 additions & 7 deletions vars/tasks.groovy
@@ -146,13 +146,14 @@ def functionalXpack(Map params = [:]) {
}
}

whenChanged([
'x-pack/plugins/apm/',
]) {
if (githubPr.isPr()) {
task(kibanaPipeline.functionalTestProcess('xpack-APMCypress', './test/scripts/jenkins_apm_cypress.sh'))
}
}
//temporarily disable apm e2e test since it's breaking.
// whenChanged([
// 'x-pack/plugins/apm/',
// ]) {
// if (githubPr.isPr()) {
// task(kibanaPipeline.functionalTestProcess('xpack-APMCypress', './test/scripts/jenkins_apm_cypress.sh'))
// }
// }

whenChanged([
'x-pack/plugins/uptime/',