Commit
Merge branch 'master' into dev/data-plugin-wrong-import
kibanamachine authored Mar 4, 2021
2 parents c0e7e9b + 284a77c commit 28d55ef
Showing 32 changed files with 375 additions and 193 deletions.
4 changes: 2 additions & 2 deletions dev_docs/kibana_platform_plugin_intro.mdx
Original file line number Diff line number Diff line change
@@ -68,7 +68,7 @@ We will continue to focus on adding clarity around these types of services and w

### Core services

Sometimes referred to just as Core, Core services provide the most basic and fundamental tools necessary for building a plugin, like creating saved objects,
Sometimes referred to just as <DocLink id="kibServerAndCoreComponents" text="Core, Core services"/> provide the most basic and fundamental tools necessary for building a plugin, like creating saved objects,
routing, application registration, notifications and <DocLink id="kibCoreLogging" text="logging"/>. The Core platform is not a plugin itself, although
there are some plugins that provide platform functionality. We call these <DocLink id="kibPlatformIntro" section="platform-plugins" text="Platform plugins"/>.

@@ -141,4 +141,4 @@ plugins to customize the Kibana experience. Examples of extension points are:

## Follow up material

Learn how to build your own plugin by following <DocLink id="kibDevTutorialBuildAPlugin" />
Learn how to build your own plugin by following <DocLink id="kibDevTutorialBuildAPlugin" />.
30 changes: 30 additions & 0 deletions dev_docs/kibana_server_core_components.mdx
@@ -0,0 +1,30 @@
---
id: kibServerAndCoreComponents
slug: /kibana-dev-docs/core-intro
title: Core components
summary: An introduction to the Kibana server and core components.
date: 2021-02-26
tags: ['kibana','onboarding', 'dev', 'architecture']
---

Core is a set of systems (frontend, backend, etc.) that Kibana and its plugins are built on top of.

## Integration with the "legacy" Kibana

Most of the existing core functionality is still spread over "legacy" Kibana, and it will take some time to upgrade it.
Kibana is started using the existing "legacy" CLI, which bootstraps `core`, which in turn creates the "legacy" Kibana server.
At the moment `core` manages HTTP connections and handles TLS configuration and the base path proxy. All requests to the Kibana server
first hit the HTTP server exposed by `core`, which decides whether a request can be handled solely by the new
platform or should be proxied to the "legacy" Kibana. This setup allows `core` to gradually introduce any "pre-route"
processing logic, expose new routes, or replace old ones currently handled by the "legacy" Kibana.
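
The "handle or proxy" decision described above can be pictured as a tiny pre-route check. This is an illustrative sketch only — `newPlatformRoutes` and `shouldProxyToLegacy` are invented names, not Kibana internals:

```typescript
// Illustrative pre-route decision: requests whose paths are registered with
// the new platform are handled by `core`; everything else is proxied to the
// "legacy" Kibana server. All names here are invented for illustration.
const newPlatformRoutes = new Set(['/api/status', '/api/core/capabilities']);

function shouldProxyToLegacy(path: string): boolean {
  // `core` proxies anything it does not know how to handle itself.
  return !newPlatformRoutes.has(path);
}

console.log(shouldProxyToLegacy('/api/status')); // false: handled by core
console.log(shouldProxyToLegacy('/app/discover')); // true: proxied to legacy
```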

Once the config has been loaded, and some of its parts validated by `core`, it is passed to the "legacy" Kibana, where
it is validated again; this lets us make config validation stricter with the new config validation system.
Even though the new validation system provided by `core` is also based on Joi internally, it is complemented with custom
rules tailored to our needs (e.g. `byteSize`, `duration`). That means config values that were previously accepted
by the "legacy" Kibana may now be rejected by `core`.
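
A custom rule like `byteSize` can be pictured as follows. This is a minimal sketch under assumptions — the real rule lives in Kibana's config validation system and handles many more cases:

```typescript
// Illustrative sketch of a `byteSize`-style rule: accept strings such as
// '1kb' or '5mb', normalize them to a number of bytes, and reject anything
// else. The real rule in Kibana's config system is more thorough.
const UNITS: Record<string, number> = { b: 1, kb: 1024, mb: 1024 ** 2, gb: 1024 ** 3 };

function parseByteSize(value: string): number {
  const match = /^(\d+)(b|kb|mb|gb)$/.exec(value.toLowerCase());
  if (!match) {
    throw new Error(`[byteSize] expected a value like '1kb', got '${value}'`);
  }
  return Number(match[1]) * UNITS[match[2]];
}

console.log(parseByteSize('1kb')); // 1024
console.log(parseByteSize('5mb')); // 5242880
```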

### Logging
`core` has its own <DocLink id="kibCoreLogging" text="logging system"/> and will output log records directly (e.g. to a file or the terminal) when configured. When no specific configuration is provided, logs are forwarded to the "legacy" Kibana so that they look the same as the rest of the
log records throughout Kibana.

2 changes: 1 addition & 1 deletion docs/settings/task-manager-settings.asciidoc
@@ -27,6 +27,6 @@ Task Manager runs background tasks by polling for work on an interval. You can

| `xpack.task_manager.max_workers`
| The maximum number of tasks that this Kibana instance will run simultaneously. Defaults to 10.

Starting in 8.0, it will not be possible to set this value above 100.

|===
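
The setting above lives in `kibana.yml`; a minimal illustrative snippet (the value shown is just the documented default):

```yaml
# kibana.yml
# Must stay at or below 100 starting in 8.0.
xpack.task_manager.max_workers: 10
```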
4 changes: 4 additions & 0 deletions docs/user/alerting/defining-alerts.asciidoc
@@ -95,6 +95,10 @@ Some cases exist where the variable values will be "escaped", when used in a con

Mustache also supports "triple braces" of the form `{{{variable name}}}`, which indicates no escaping should be done at all. Care should be taken when using this form, as it could end up rendering the variable content in such a way as to make the resulting parameter invalid or formatted incorrectly.

Each alert type defines additional variables as properties of the variable `context`. For example, if an alert type defines a variable `value`, it can be used in an action parameter as `{{context.value}}`.

For diagnostic or exploratory purposes, action variables whose values are objects, such as `context`, can be referenced directly as variables. The resulting value will be a JSON representation of the object. For example, if an action parameter includes `{{context}}`, it will expand to the JSON representation of all the variables and values provided by the alert type.
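
The substitution rules above can be sketched with a toy renderer. This is illustrative only — not Kibana's actual Mustache implementation; `render` and `escapeHtml` are invented here:

```typescript
// Toy illustration of the escaping rules: double braces HTML-escape the
// value, triple braces insert it raw, and referencing an object variable
// such as `context` expands to its JSON representation. This is NOT
// Kibana's actual Mustache implementation; all names here are invented.
const escapeHtml = (s: string): string =>
  s.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/>/g, '&gt;');

function render(template: string, vars: Record<string, unknown>): string {
  const lookup = (name: string): string => {
    const value = name
      .split('.')
      .reduce<unknown>((obj, key) => (obj as any)?.[key], vars);
    return typeof value === 'object' && value !== null
      ? JSON.stringify(value)
      : String(value);
  };
  return template
    .replace(/\{\{\{(.+?)\}\}\}/g, (_, name) => lookup(name.trim()))
    .replace(/\{\{(.+?)\}\}/g, (_, name) => escapeHtml(lookup(name.trim())));
}

const vars = { context: { value: '<b>42</b>' } };
console.log(render('{{context.value}}', vars)); // &lt;b&gt;42&lt;/b&gt;
console.log(render('{{{context.value}}}', vars)); // <b>42</b>
```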

You can attach more than one action. Clicking the "Add action" button will prompt you to select another action type and repeat the above steps.

[role="screenshot"]
4 changes: 2 additions & 2 deletions package.json
@@ -58,7 +58,7 @@
"kbn:watch": "node scripts/kibana --dev --logging.json=false",
"build:types": "rm -rf ./target/types && tsc --p tsconfig.types.json",
"docs:acceptApiChanges": "node --max-old-space-size=6144 scripts/check_published_api_changes.js --accept",
"kbn:bootstrap": "node scripts/build_ts_refs",
"kbn:bootstrap": "node scripts/build_ts_refs --ignore-type-failures",
"spec_to_console": "node scripts/spec_to_console",
"backport-skip-ci": "backport --prDescription \"[skip-ci]\"",
"storybook": "node scripts/storybook",
@@ -82,7 +82,7 @@
"**/load-grunt-config/lodash": "^4.17.21",
"**/minimist": "^1.2.5",
"**/node-jose/node-forge": "^0.10.0",
"**/prismjs": "1.22.0",
"**/prismjs": "1.23.0",
"**/react-syntax-highlighter": "^15.3.1",
"**/react-syntax-highlighter/**/highlight.js": "^10.4.1",
"**/request": "^2.88.2",
8 changes: 4 additions & 4 deletions src/core/CONVENTIONS.md
@@ -202,14 +202,14 @@ export class MyPlugin implements Plugin {
}
```

Prefer the pattern shown above, using `core.getStartServices()`, rather than storing local references retrieved from `start`.

**Bad:**
```ts
export class MyPlugin implements Plugin {
// Anti pattern
private coreStart?: CoreStart;
private depsStart?: DepsStart;

public setup(core) {
core.application.register({
@@ -220,7 +220,7 @@ export class MyPlugin implements Plugin {
return renderApp(this.coreStart, this.depsStart, params);
}
});
}

public start(core, deps) {
// Anti pattern
@@ -361,5 +361,5 @@ Migration example from the legacy format is available in `src/core/MIGRATION_EXA

### Naming conventions

Export start and setup contracts as `MyPluginStart` and `MyPluginSetup`.
This avoids the naming clashes that would occur if everyone exported them simply as `Start` and `Setup`.
16 changes: 8 additions & 8 deletions src/core/CORE_CONVENTIONS.md
@@ -15,23 +15,23 @@ area of Core APIs and does not apply to internal types.

- 1.1 All API types must be exported from the top-level `server` or `public`
directories.

```ts
// -- good --
import { IRouter } from 'src/core/server';

// -- bad --
import { IRouter } from 'src/core/server/http/router.ts';
```

> Why? This is required for generating documentation from our inline
> typescript doc comments, makes it easier for API consumers to find the
> relevant types and creates a clear distinction between external and
> internal types.
- 1.2 Classes must not be exposed directly. Instead, use a separate type,
prefixed with an 'I', to describe the public contract of the class.

```ts
// -- good (alternative 1) --
/**
@@ -66,27 +66,27 @@
```

> Why? Classes' private members form part of their type signature, making it
> impossible to mock a dependency typed as a `class`.
>
> Until we can use ES private field support in TypeScript 3.8
> https://github.com/elastic/kibana/issues/54906 we have two alternatives,
> each with its own pros and cons:
>
> #### Using a derived class (alternative 1)
>
> Pros:
> - TSDoc comments are located with the source code
> - The class acts as a single source of type information
>
> Cons:
> - "Go to definition" first takes you to where the type gets derived
> requiring a second "Go to definition" to navigate to the type source.
>
> #### Using a separate interface (alternative 2)
> Pros:
> - Creates an explicit external API contract
> - "Go to definition" will take you directly to the type definition.
>
> Cons:
> - TSDoc comments are located with the interface not next to the
> implementation source code.
10 changes: 5 additions & 5 deletions src/core/README.md
@@ -9,7 +9,7 @@ Core Plugin API Documentation:
- [Conventions for Plugins](./CONVENTIONS.md)
- [Testing Kibana Plugins](./TESTING.md)
- [Kibana Platform Plugin API](./docs/developer/architecture/kibana-platform-plugin-api.asciidoc )

Internal Documentation:
- [Saved Objects Migrations](./server/saved_objects/migrations/README.md)

@@ -18,18 +18,18 @@ Internal Documentation:
Most of the existing core functionality is still spread over "legacy" Kibana, and it will take some time to upgrade it.
Kibana is started using the existing "legacy" CLI, which bootstraps `core`, which in turn creates the "legacy" Kibana server.
At the moment `core` manages HTTP connections and handles TLS configuration and the base path proxy. All requests to the Kibana server
first hit the HTTP server exposed by `core`, which decides whether a request can be handled solely by the new
platform or should be proxied to the "legacy" Kibana. This setup allows `core` to gradually introduce any "pre-route"
processing logic, expose new routes, or replace old ones currently handled by the "legacy" Kibana.

Once the config has been loaded, and some of its parts validated by `core`, it is passed to the "legacy" Kibana, where
it is validated again; this lets us make config validation stricter with the new config validation system.
Even though the new validation system provided by `core` is also based on Joi internally, it is complemented with custom
rules tailored to our needs (e.g. `byteSize`, `duration`). That means config values that were previously accepted
by the "legacy" Kibana may now be rejected by `core`.

### Logging
`core` has its own [logging system](./server/logging/README.mdx) and will output log records directly (e.g. to a file or the terminal) when configured. When no
specific configuration is provided, logs are forwarded to the "legacy" Kibana so that they look the same as the rest of the
log records throughout Kibana.
log records throughout Kibana.

30 changes: 17 additions & 13 deletions src/dev/bazel_workspace_status.js
@@ -17,43 +17,47 @@
// If the script exits with non-zero code, it's considered as a failure
// and the output will be discarded.

(async () => {
const execa = require('execa');
(() => {
const cp = require('child_process');
const os = require('os');

async function runCmd(cmd, args) {
function runCmd(cmd, args) {
try {
return await execa(cmd, args);
const spawnResult = cp.spawnSync(cmd, args);
const exitCode = spawnResult.status !== null ? spawnResult.status : 1;
const stdoutStr = spawnResult.stdout.toString();
const stdout = stdoutStr ? stdoutStr.trim() : null;

return {
exitCode,
stdout,
};
} catch (e) {
return { exitCode: 1 };
}
}

// Git repo
const kbnGitOriginName = process.env.KBN_GIT_ORIGIN_NAME || 'origin';
const repoUrlCmdResult = await runCmd('git', [
'config',
'--get',
`remote.${kbnGitOriginName}.url`,
]);
const repoUrlCmdResult = runCmd('git', ['config', '--get', `remote.${kbnGitOriginName}.url`]);
if (repoUrlCmdResult.exitCode === 0) {
// Only output REPO_URL when found it
console.log(`REPO_URL ${repoUrlCmdResult.stdout}`);
}

// Commit SHA
const commitSHACmdResult = await runCmd('git', ['rev-parse', 'HEAD']);
const commitSHACmdResult = runCmd('git', ['rev-parse', 'HEAD']);
if (commitSHACmdResult.exitCode === 0) {
console.log(`COMMIT_SHA ${commitSHACmdResult.stdout}`);

// Branch
const gitBranchCmdResult = await runCmd('git', ['rev-parse', '--abbrev-ref', 'HEAD']);
const gitBranchCmdResult = runCmd('git', ['rev-parse', '--abbrev-ref', 'HEAD']);
if (gitBranchCmdResult.exitCode === 0) {
console.log(`GIT_BRANCH ${gitBranchCmdResult.stdout}`);
}

// Tree status
const treeStatusCmdResult = await runCmd('git', ['diff-index', '--quiet', 'HEAD', '--']);
const treeStatusCmdResult = runCmd('git', ['diff-index', '--quiet', 'HEAD', '--']);
const treeStatusVarStr = 'GIT_TREE_STATUS';
if (treeStatusCmdResult.exitCode === 0) {
console.log(`${treeStatusVarStr} Clean`);
@@ -64,7 +68,7 @@

// Host
if (process.env.CI) {
const hostCmdResult = await runCmd('hostname');
const hostCmdResult = runCmd('hostname');
const hostStr = hostCmdResult.stdout.split('-').slice(0, -1).join('-');
const coresStr = os.cpus().filter((cpu, index) => {
return !cpu.model.includes('Intel') || index % 2 === 1;
38 changes: 36 additions & 2 deletions src/dev/typescript/build_ts_refs_cli.ts
@@ -18,9 +18,24 @@ import { concurrentMap } from './concurrent_map';

const CACHE_WORKING_DIR = Path.resolve(REPO_ROOT, 'data/ts_refs_output_cache');

const TS_ERROR_REF = /\sTS\d{1,6}:\s/;

const isTypeFailure = (error: any) =>
error.exitCode === 1 &&
error.stderr === '' &&
typeof error.stdout === 'string' &&
TS_ERROR_REF.test(error.stdout);

export async function runBuildRefsCli() {
run(
async ({ log, flags }) => {
if (process.env.BUILD_TS_REFS_DISABLE === 'true' && !flags.force) {
log.info(
'Building ts refs is disabled because the BUILD_TS_REFS_DISABLE environment variable is set to "true". Pass `--force` to run the build anyway.'
);
return;
}

const outDirs = getOutputsDeep(REF_CONFIG_PATHS);

const cacheEnabled = process.env.BUILD_TS_REFS_CACHE_ENABLE !== 'false' && !!flags.cache;
@@ -48,7 +63,20 @@ export async function runBuildRefsCli() {
await outputCache.initCaches();
}

await buildAllTsRefs(log);
try {
await buildAllTsRefs(log);
log.success('ts refs build successfully');
} catch (error) {
const typeFailure = isTypeFailure(error);

if (flags['ignore-type-failures'] && typeFailure) {
log.warning(
'tsc reported type errors but we are ignoring them for now, to see them please run `node scripts/type_check` or `node scripts/build_ts_refs` without the `--ignore-type-failures` flag.'
);
} else {
throw error;
}
}

if (outputCache && doCapture) {
await outputCache.captureCache(Path.resolve(REPO_ROOT, 'target/ts_refs_cache'));
@@ -61,10 +89,16 @@ export async function runBuildRefsCli() {
{
description: 'Build TypeScript projects',
flags: {
boolean: ['clean', 'cache'],
boolean: ['clean', 'force', 'cache', 'ignore-type-failures'],
default: {
cache: true,
},
help: `
--force Run the build even if the BUILD_TS_REFS_DISABLE is set to "true"
--clean Delete outDirs for each ts project before building
--no-cache Disable fetching/extracting outDir caches based on the mergeBase with upstream
--ignore-type-failures If tsc reports type errors, ignore them and just log a small warning.
`,
},
log: {
defaultLevel: 'debug',
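
The `TS_ERROR_REF` heuristic introduced above can be exercised in isolation. A small sketch (the sample strings here are invented, not real tsc output captured from this repo):

```typescript
// The regex looks for a token like " TS2322: " that tsc embeds in its
// diagnostics; combined with exit code 1 and empty stderr, it identifies a
// type failure (as opposed to, say, a crash). Sample strings are invented.
const TS_ERROR_REF = /\sTS\d{1,6}:\s/;

const typeErrorOutput =
  "src/foo.ts(3,1): error TS2322: Type 'string' is not assignable to type 'number'.";
const unrelatedFailure = 'Error: ENOENT: no such file or directory';

console.log(TS_ERROR_REF.test(typeErrorOutput)); // true
console.log(TS_ERROR_REF.test(unrelatedFailure)); // false
```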
2 changes: 1 addition & 1 deletion src/plugins/maps_legacy/public/map/_legend.scss
@@ -1,6 +1,6 @@
.visMapLegend {
@include fontSize(11px);
@include euiBottomShadowMedium($color: $euiShadowColorLarge, $opacity: .1);
@include euiBottomShadowMedium($color: $euiShadowColorLarge);
font-family: $euiFontFamily;
font-weight: $euiFontWeightMedium;
line-height: $euiLineHeight;
@@ -30,10 +30,8 @@ export function getConnections({
if (!paths) {
return [];
}
const isEnvironmentSelected =
environment && environment !== ENVIRONMENT_ALL.value;

if (serviceName || isEnvironmentSelected) {
if (serviceName || environment) {
paths = paths.filter((path) => {
return (
path
Expand All @@ -46,7 +44,7 @@ export function getConnections({
return false;
}

if (!environment) {
if (!environment || environment === ENVIRONMENT_ALL.value) {
return true;
}

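
The intent of the change above — treating the "all environments" sentinel the same as no filter at all — can be distilled as follows (illustrative; `ENVIRONMENT_ALL` and `matchesEnvironment` are stand-ins, not the real APM code):

```typescript
// Illustrative distillation of the filter change: an undefined environment
// or the "all environments" sentinel means no filtering; otherwise the
// path's environment must match. Names are stand-ins for the real constants.
const ENVIRONMENT_ALL = { value: 'ENVIRONMENT_ALL' };

function matchesEnvironment(
  pathEnvironment: string,
  environment?: string
): boolean {
  if (!environment || environment === ENVIRONMENT_ALL.value) {
    return true; // no effective environment filter
  }
  return pathEnvironment === environment;
}

console.log(matchesEnvironment('production', ENVIRONMENT_ALL.value)); // true
console.log(matchesEnvironment('production', 'staging')); // false
```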
@@ -21,6 +21,7 @@ import {
} from '@elastic/eui';
import { i18n } from '@kbn/i18n';
import { FormattedMessage } from '@kbn/i18n/react';
import semverLt from 'semver/functions/lt';

import { useUIExtension } from '../../../../hooks/use_ui_extension';
import { PAGE_ROUTING_PATHS, PLUGIN_ID } from '../../../../constants';
@@ -80,7 +81,7 @@ export function Detail() {
packageInfo &&
'savedObject' in packageInfo &&
packageInfo.savedObject &&
packageInfo.savedObject.attributes.version < packageInfo.latestVersion;
semverLt(packageInfo.savedObject.attributes.version, packageInfo.latestVersion);

// Fetch package info
const { data: packageInfoData, error: packageInfoError, isLoading } = useGetPackageInfoByKey(
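
Context for the `semverLt` change above: comparing version strings with `<` is lexicographic and gives wrong answers once a segment reaches two digits. A minimal numeric sketch (the real `semver` package also handles prereleases, build metadata, and more):

```typescript
// Plain string comparison of versions is lexicographic, so '0.10.0' < '0.9.0'
// evaluates to true — the bug this diff fixes by switching to `semverLt`.
const lexicographic = '0.10.0' < '0.9.0'; // true, which is wrong

// Minimal numeric major.minor.patch comparison (illustrative only; the real
// `semver` package also handles prerelease tags, build metadata, and ranges).
function semverLtSketch(a: string, b: string): boolean {
  const pa = a.split('.').map(Number);
  const pb = b.split('.').map(Number);
  for (let i = 0; i < 3; i++) {
    if (pa[i] !== pb[i]) return pa[i] < pb[i];
  }
  return false;
}

console.log(lexicographic); // true
console.log(semverLtSketch('0.10.0', '0.9.0')); // false — the correct answer
```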