
Neutral naming for integrations (elastic#2545)
v1v authored Jan 24, 2022
1 parent eac4c40 commit 5176089
Showing 13 changed files with 25 additions and 25 deletions.
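All of the line changes in this commit are mechanical `master` → `main` renames. A sweep like this is commonly generated with `git grep` plus `sed`; below is a minimal, self-contained sketch of that approach (not the tooling actually used for this commit — and note that some `master` references, such as links into repositories that still use a `master` branch, must be reviewed and left untouched):

```shell
# Build a throwaway repo with one 'master' link, then rewrite it.
tmp=$(mktemp -d) && cd "$tmp"
git init -q repo && cd repo
printf '[guide](https://github.com/elastic/integrations/blob/master/CONTRIBUTING.md)\n' > README.md
git add README.md
# Rewrite only path-style branch references; review the diff before committing.
git grep -l '/blob/master/' -- '*.md' \
  | xargs sed -i 's#/blob/master/#/blob/main/#g'
grep -o '/blob/main/' README.md
```

The narrow `s#/blob/master/#...#` pattern (rather than a bare `s/master/main/`) is what keeps unrelated occurrences of the word "master" safe.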
6 changes: 3 additions & 3 deletions .ci/Jenkinsfile
@@ -82,7 +82,7 @@ pipeline {
}

// Publish package to the Package Storage
-if (env.BRANCH_NAME == 'master') {
+if (env.BRANCH_NAME == 'main') {
withCredentials([string(credentialsId: "${GITHUB_TOKEN_CREDENTIALS}", variable: 'GITHUB_TOKEN')]) {
sh(label: 'Configure Git user.name', script: 'git config --global user.name "Elastic Machine"')
sh(label: 'Configure Git user.email', script: 'git config --global user.email "[email protected]"')
@@ -164,8 +164,8 @@ def isPrAffected(integrationName) {
}
}

-if (env.BRANCH_NAME == "master") {
-echo "[${integrationName}] PR is affected: running on master branch"
+if (env.BRANCH_NAME == "main") {
+echo "[${integrationName}] PR is affected: running on main branch"
return true
}

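The `isPrAffected` gating above reduces to a plain branch comparison: on the default branch every integration is treated as affected. A sketch of the same logic in shell (`BRANCH_NAME` is provided by Jenkins in the real pipeline; the values here are illustrative):

```shell
# Illustrative stand-ins for the variables Jenkins injects.
BRANCH_NAME=main
integrationName=apache
# On main, skip the PR-diff check entirely and run everything.
if [ "$BRANCH_NAME" = "main" ]; then
  echo "[$integrationName] PR is affected: running on main branch"
fi
```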
2 changes: 1 addition & 1 deletion .ci/jobs/integrations-daily.yml
@@ -17,6 +17,6 @@
credentials-id: f6c7695a-671e-4f4f-a331-acdce44ff9ba
reference-repo: /var/lib/jenkins/.git-references/integrations.git
branches:
-- master
+- main
triggers:
- timed: 'H H(2-5) * * *'
2 changes: 1 addition & 1 deletion .ci/schedule-daily.groovy
@@ -6,7 +6,7 @@ pipeline {
NOTIFY_TO = credentials('notify-to')
PIPELINE_LOG_LEVEL = 'INFO'
SLACK_CHANNEL = "#beats-build"
-INTEGRATION_JOB = 'Ingest-manager/integrations/master'
+INTEGRATION_JOB = 'Ingest-manager/integrations/main'
}
options {
timeout(time: 4, unit: 'HOURS')
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/integration-checklist.md
@@ -11,7 +11,7 @@ when creating or updating a Package, Module or Dataset for an Integration.

### All changes

-- [ ] Change follows the [contributing guidelines](https://github.com/elastic/integrations/blob/master/CONTRIBUTING.md)
+- [ ] Change follows the [contributing guidelines](https://github.com/elastic/integrations/blob/main/CONTRIBUTING.md)
- [ ] Supported versions of the monitoring target are documented
- [ ] Supported operating systems are documented (if applicable)
- [ ] Integration or [System tests](https://github.com/elastic/elastic-package/blob/master/docs/howto/system_testing.md) exist
2 changes: 1 addition & 1 deletion .github/PULL_REQUEST_TEMPLATE.md
@@ -14,7 +14,7 @@ Explain here the changes you made on the PR.

## Checklist

-- [ ] I have reviewed [tips for building integrations](https://github.com/elastic/integrations/blob/master/docs/tips_for_building_integrations.md) and this pull request is aligned with them.
+- [ ] I have reviewed [tips for building integrations](https://github.com/elastic/integrations/blob/main/docs/tips_for_building_integrations.md) and this pull request is aligned with them.
- [ ] I have verified that all data streams collect metrics or logs.
- [ ] I have added an entry to my package's `changelog.yml` file.
- [ ] I have verified that Kibana version constraints are current according to [guidelines](https://github.com/elastic/elastic-package/blob/master/docs/howto/stack_version_support.md#when-to-update-the-condition).
2 changes: 1 addition & 1 deletion .mergify.yml
@@ -8,7 +8,7 @@ pull_request_rules:
conditions:
- check-success=integrations/pr-merge
- check-success=CLA
-- base=master
+- base=main
- author~=^dependabot(|-preview)\[bot\]$
actions:
queue:
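The `author~=` condition in the Mergify rule above is a regular expression matched against the PR author's login. The pattern itself can be exercised with `grep -E` (a quick check of the regex only, not of Mergify's evaluation):

```shell
# Probe the Dependabot author pattern against matching and
# non-matching logins.
for author in 'dependabot[bot]' 'dependabot-preview[bot]' 'octocat'; do
  if printf '%s\n' "$author" | grep -qE '^dependabot(|-preview)\[bot\]$'; then
    echo "$author matches"
  else
    echo "$author does not match"
  fi
done
```

The `\[bot\]` escapes are required because square brackets are character-class metacharacters in extended regex syntax.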
4 changes: 2 additions & 2 deletions README.md
@@ -1,4 +1,4 @@
-[![Build Status](https://beats-ci.elastic.co/job/ingest-manager/job/integrations/job/master/badge/icon)](https://beats-ci.elastic.co/job/ingest-manager/job/integrations/job/master/)
+[![Build Status](https://beats-ci.elastic.co/job/ingest-manager/job/integrations/job/main/badge/icon)](https://beats-ci.elastic.co/job/ingest-manager/job/integrations/job/main/)

# Elastic Integrations

@@ -36,4 +36,4 @@ explore the builder tools.

## Test Coverage

-[![Test Coverage Report](https://beats-ci.elastic.co/job/ingest-manager/job/integrations/job/master/cobertura/graph)](https://beats-ci.elastic.co/job/Ingest-manager/job/integrations/job/master/cobertura/)
+[![Test Coverage Report](https://beats-ci.elastic.co/job/ingest-manager/job/integrations/job/main/cobertura/graph)](https://beats-ci.elastic.co/job/Ingest-manager/job/integrations/job/main/cobertura/)
8 changes: 4 additions & 4 deletions docs/definitions.md
@@ -19,7 +19,7 @@ An integration is a specific type of a _package_ defining data streams used to o

## Data stream

A data stream is logical sub-division of an Integration package, dealing with a specific type of observable aspect of the service or product being observed. For example, the `mysql` package defines a data stream for collecting metrics and another data stream for collecting server logs.

A data stream defines all the assets needed to create an Elasticsearch data stream, for example: index templates and ingest pipelines. These assets are loaded into Elasticsearch when a user installs a package via the Fleet UI in Kibana.

@@ -37,13 +37,13 @@ The data stream consists of:

The `_dev` directory is part of [the package spec](https://github.com/elastic/package-spec), containing development resources. These development resources cover any types of files/folders needed only at development time. This includes resources needed for testing but also includes any templates that might be used for generating documentation. In the future it could include other files/folders needed just at development time. It can be defined on the following levels:

-1. the package-level `_dev` folder contains files needed to setup the testing environment for that package. This environment setup is specified via folders/files in the `_dev/deploy` folder. For example, the `apache` package [specifies](https://github.com/elastic/integrations/tree/master/packages/apache/_dev/deploy) how to spin up an Apache Docker container for testing.
-1. the data stream-level `_dev` folder contains test configuration files for various types of tests. For example, see the [`_dev/test` folder](https://github.com/elastic/integrations/tree/master/packages/apache/data_stream/error/_dev/test) under the `apache/error` data stream.
+1. the package-level `_dev` folder contains files needed to setup the testing environment for that package. This environment setup is specified via folders/files in the `_dev/deploy` folder. For example, the `apache` package [specifies](https://github.com/elastic/integrations/tree/main/packages/apache/_dev/deploy) how to spin up an Apache Docker container for testing.
+1. the data stream-level `_dev` folder contains test configuration files for various types of tests. For example, see the [`_dev/test` folder](https://github.com/elastic/integrations/tree/main/packages/apache/data_stream/error/_dev/test) under the `apache/error` data stream.

The integrations have also [asset](https://github.com/elastic/elastic-package/blob/master/docs/howto/asset_testing.md) and [static](https://github.com/elastic/elastic-package/blob/master/docs/howto/static_testing.md) tests. They don't require config files, but configs can be used to mark them as optional.

## Migration from Beats Modules

Filebeat and Metricbeat modules can be migrated over to Elastic Integrations. When migrating over, the same module in Filebeat and Metricbeat, related to the same observed product, can be combined into a single Elastic Integration.

[Learn more](/docs/import_from_beats.md) about how to migrate Filebeat and Metricbeat modules to Elastic Integrations.
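The two `_dev` levels described in this file (package-level `_dev/deploy` and data-stream-level `_dev/test`) can be sketched as a directory layout. The paths below are illustrative, modeled on the `apache` package; real packages contain additional files such as `manifest.yml` and field definitions:

```shell
# Create the skeleton in a throwaway directory and print it.
tmp=$(mktemp -d) && cd "$tmp"
mkdir -p packages/apache/_dev/deploy/docker
mkdir -p packages/apache/data_stream/error/_dev/test/pipeline
find packages -type d | sort
```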
2 changes: 1 addition & 1 deletion docs/developer_workflow_design_build_test_integration.md
@@ -127,7 +127,7 @@ Feel free to merge the PR once you receive an approval from the Integrations tea

### Remember to bump up the version

-When the PR is merged, the CI will kick off a build job for the master branch, which can release your integration to
+When the PR is merged, the CI will kick off a build job for the main branch, which can release your integration to
the package-storage. It means that it will open a PR to the [Package Storage/snapshot](https://github.com/elastic/package-storage/tree/snapshot/packages) with
the built integration if only the package version doesn't already exist in the storage (hasn't been released yet).

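The version that the release job compares against package-storage comes from the package's `manifest.yml`. A minimal sketch of that extraction, using a hypothetical manifest (the real CI performs its own check against the storage repository):

```shell
# Hypothetical manifest; 'version' is the field the release job compares
# against the versions already present in package-storage.
tmp=$(mktemp -d) && cd "$tmp"
mkdir -p packages/apache
printf 'name: apache\nversion: 1.2.3\n' > packages/apache/manifest.yml
version=$(sed -n 's/^version: //p' packages/apache/manifest.yml)
echo "would release apache-$version"
```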
4 changes: 2 additions & 2 deletions docs/developer_workflow_fleet_ui.md
@@ -2,7 +2,7 @@

## Development workflow

-See the Kibana docs for [how to set up your dev environment](https://github.com/elastic/kibana/blob/master/CONTRIBUTING.md#setting-up-your-development-environment), [run Elasticsearch](https://github.com/elastic/kibana/blob/master/CONTRIBUTING.md#running-elasticsearch), and [start Kibana](https://github.com/elastic/kibana/blob/master/CONTRIBUTING.md#running-kibana)
+See the Kibana docs for [how to set up your dev environment](https://github.com/elastic/kibana/blob/main/CONTRIBUTING.md#setting-up-your-development-environment), [run Elasticsearch](https://github.com/elastic/kibana/blob/main/CONTRIBUTING.md#running-elasticsearch), and [start Kibana](https://github.com/elastic/kibana/blob/main/CONTRIBUTING.md#running-kibana)

One common development workflow is:

@@ -84,4 +84,4 @@ Ensure you provide the `-p 8220:8220` port mapping to map the Fleet Server conta

For the latest version, use `8.0.0-SNAPSHOT`. Otherwise, you can explore the available versions at https://www.docker.elastic.co/r/beats/elastic-agent.

Once the Fleet Server container is running, you should be able to treat it as if it were a local process running on `http://localhost:8220` when configuring Fleet via the UI. You can then run `elastic-agent` on your local machine directly for testing purposes.
4 changes: 2 additions & 2 deletions docs/fine_tune_integration.md
@@ -134,11 +134,11 @@ what's been already fixed, as the script has overridden part of it).
11. Update docs template with sample events.
The events collected by the agent slightly differ from original, Metricbeat's and Filebeat's, ones. Adjust the event
-content manually basing on already migrated integrations (e.g. [MySQL integration](https://github.com/elastic/integrations/blob/master/packages/mysql/_dev/build/docs/README.md))
+content manually basing on already migrated integrations (e.g. [MySQL integration](https://github.com/elastic/integrations/blob/main/packages/mysql/_dev/build/docs/README.md))
or copy them once managed to run whole setup with real agent.
12. Kibana: use `stream.dataset` field instead of `event.dataset`.

Using `stream.dataset` instead of `event.dataset` also makes queries a lot more efficient as this is a
`constant_keyword`. Make sure that dashboards in your package don't use the `event.dataset` field. If so,
simply replace them with the more efficient one.
8 changes: 4 additions & 4 deletions docs/import_from_beats.md
@@ -1,7 +1,7 @@
# Import from Beats modules

The import procedure heavily uses on the _import-beats_ script. If you are interested how does it work internally,
-feel free to review the script's [README](https://github.com/elastic/integrations/tree/master/dev/import-beats/README.md).
+feel free to review the script's [README](https://github.com/elastic/integrations/tree/main/dev/import-beats/README.md).

1. Create an issue in the [integrations](https://github.com/elastic/integrations) to track ongoing progress with
the integration (especially manual changes).
@@ -30,9 +30,9 @@ feel free to review the script's [README](https://github.com/elastic/integration
2. Kibana instance:
* used to migrate dashboards, if not available, you can skip the generation (`SKIP_KIBANA=true`)
-_Hint_. There is the `elastic-package` cheat sheet available [here](https://github.com/elastic/integrations/blob/master/testing/environments/README.md).
+_Hint_. There is the `elastic-package` cheat sheet available [here](https://github.com/elastic/integrations/blob/main/testing/environments/README.md).
-4. Create a new branch for the integration in `integrations` repository (diverge from master).
+4. Create a new branch for the integration in `integrations` repository (diverge from main).
5. Run the command: `mage ImportBeats` to start the import process (note that the import script assumes the projects checked out in step 2 are at `../{project-name}`).
The outcome of running the `import-beats` script is directory with refreshed and updated integrations.
@@ -49,4 +49,4 @@ feel free to review the script's [README](https://github.com/elastic/integration

```bash
$ PACKAGES=aws,cisco mage ImportBeats
```
4 changes: 2 additions & 2 deletions docs/testing_and_validation.md
@@ -109,7 +109,7 @@ integrations.
The CI job runner collects coverage data and stores them together with build artifacts. The Cobertura plugin (*Coverage Report* tab) uses these data
to visualize test coverage grouped by package, data stream and test type.

-See test coverage report for the *master* branch: [link](https://beats-ci.elastic.co/job/Ingest-manager/job/integrations/job/master/cobertura/)
+See test coverage report for the *main* branch: [link](https://beats-ci.elastic.co/job/Ingest-manager/job/integrations/job/main/cobertura/)

### Cobertura format vs. package domain language

@@ -122,4 +122,4 @@ We decided to make few assumptions for the Cobertura classification:

**Class** - test type (pipeline tests, system tests, etc.)

**Method** - "OK" if there are any tests present
