
Ensure that stack monitoring Metricbeat modules are tested with the next release #28763

Closed
jsoriano opened this issue Nov 2, 2021 · 6 comments

@jsoriano
Member

jsoriano commented Nov 2, 2021

Metricbeat modules for stack monitoring are currently tested only against the last released version, but not against the next one. This can lead to issues like #27495, where a Metricbeat monitoring itself runs into problems.

A possible strategy is to use snapshots with an update strategy similar to the one used for the testing environments in testing/environments/snapshot.yml, or to use these environments directly. There should still be tests covering released versions, at least the last one, to ensure that breaking changes are not introduced.

For the specific case of the beat module, it would be good to have unit or integration tests that check that the current code in the beat module can monitor the monitoring endpoint exposed by the current Beats code.
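
A minimal sketch of what such a fixture-based check could look like, assuming a recorded `/stats` payload is committed as testdata; the fixture path and the asserted keys are illustrative assumptions, not the module's actual test layout:

```go
package beat_test

import (
	"encoding/json"
	"net/http"
	"net/http/httptest"
	"os"
	"testing"
)

func TestBeatMonitoringStatsFixture(t *testing.T) {
	// Hypothetical fixture recorded from a beat built on the current branch,
	// e.g. `curl localhost:5066/stats` with http.enabled: true.
	payload, err := os.ReadFile("_meta/testdata/stats.next.json")
	if err != nil {
		t.Skipf("fixture not present: %v", err)
	}

	// Serve the recorded payload as if it were the live monitoring endpoint.
	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/json")
		w.Write(payload)
	}))
	defer srv.Close()

	resp, err := http.Get(srv.URL + "/stats")
	if err != nil {
		t.Fatalf("fetching fake monitoring endpoint: %v", err)
	}
	defer resp.Body.Close()

	var stats map[string]interface{}
	if err := json.NewDecoder(resp.Body).Decode(&stats); err != nil {
		t.Fatalf("decoding /stats payload: %v", err)
	}

	// Assumes the /stats payload keeps its top-level "beat" and "libbeat"
	// sections; if the next release renames or drops them, this should fail.
	for _, key := range []string{"beat", "libbeat"} {
		if _, ok := stats[key]; !ok {
			t.Errorf("expected top-level %q section in /stats payload", key)
		}
	}
}
```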

@botelastic botelastic bot added the needs_team label (Indicates that the issue/PR needs a Team:* label) Nov 2, 2021
@elasticmachine
Collaborator

Pinging @elastic/stack-monitoring (Stack monitoring)

@jsoriano jsoriano added the Team:Integrations label (Label for the Integrations team) Nov 2, 2021
@elasticmachine
Collaborator

Pinging @elastic/integrations (Team:Integrations)

@botelastic botelastic bot removed the needs_team label (Indicates that the issue/PR needs a Team:* label) Nov 2, 2021
@matschaffer
Contributor

In my recent exploration of the metricbeat kibana module, I was surprised not to find anything like _meta/testdata that would help ensure module functionality across a (probably small) range of stack versions.

I tried to add it, but it doesn't look like testdata is set up to support modules that require extra endpoint requests as part of the init phase (like how kibana stats module checks the version).

Is there any plan to support response fixtures & expected payload testing for more complex modules? Seems like it could be helpful in fixing this issue as well.
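
For illustration, here is a minimal sketch of the kind of fixture support being asked about: a fake Kibana that answers both the version probe made during the init phase and the stats request made during the fetch. The paths and payload shapes are simplified assumptions, not copies of the real module's requests:

```go
package kibana_test

import (
	"fmt"
	"net/http"
	"net/http/httptest"
	"testing"
)

// newFakeKibana serves canned responses for both the init-phase version
// probe and the stats fetch. Paths and payload shapes are assumptions.
func newFakeKibana(version string) *httptest.Server {
	mux := http.NewServeMux()
	mux.HandleFunc("/api/status", func(w http.ResponseWriter, r *http.Request) {
		// Version probe done while the metricset is being initialized.
		fmt.Fprintf(w, `{"version": {"number": %q}}`, version)
	})
	mux.HandleFunc("/api/stats", func(w http.ResponseWriter, r *http.Request) {
		// Stats payload read by the actual fetch.
		fmt.Fprintf(w, `{"kibana": {"version": %q}, "process": {"uptime_in_millis": 1000}}`, version)
	})
	return httptest.NewServer(mux)
}

func TestFakeKibanaAnswersInitAndFetch(t *testing.T) {
	srv := newFakeKibana("8.0.0-SNAPSHOT")
	defer srv.Close()

	for _, path := range []string{"/api/status", "/api/stats"} {
		resp, err := http.Get(srv.URL + path)
		if err != nil {
			t.Fatalf("GET %s: %v", path, err)
		}
		if resp.StatusCode != http.StatusOK {
			t.Fatalf("GET %s: unexpected status %d", path, resp.StatusCode)
		}
		resp.Body.Close()
	}
}
```

Testdata support for multi-request modules would then amount to pointing the metricset at a server like this instead of at a single recorded response.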

@jsoriano
Member Author

jsoriano commented Nov 5, 2021

> I tried to add it, but it doesn't look like testdata is set up to support modules that require extra endpoint requests as part of the init phase (like how kibana stats module checks the version).
>
> Is there any plan to support response fixtures & expected payload testing for more complex modules? Seems like it could be helpful in fixing this issue as well.

There have been some internal discussions about this, but we don't have a general framework yet. For rabbitmq, for example, we have a mocked server, but it doesn't support multiple versions and so far it is specific to rabbitmq; maybe it could be generalized.
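
A rough sketch of how such a mock could be generalized across versions, serving recorded responses out of per-version fixture directories; the directory layout and file naming here are assumptions for illustration, not the existing rabbitmq helper:

```go
package mtest

import (
	"net/http"
	"net/http/httptest"
	"os"
	"path/filepath"
	"strings"
)

// NewVersionedServer serves recorded JSON responses from a per-version
// fixture directory, e.g. testdata/3.9.13/api_overview.json answers
// GET /api/overview. Layout and naming are illustrative assumptions.
func NewVersionedServer(fixtureRoot, version string) *httptest.Server {
	dir := filepath.Join(fixtureRoot, version)
	return httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		// Map the request path to a fixture file: /api/overview -> api_overview.json.
		name := strings.TrimPrefix(strings.ReplaceAll(r.URL.Path, "/", "_"), "_") + ".json"
		payload, err := os.ReadFile(filepath.Join(dir, name))
		if err != nil {
			http.NotFound(w, r)
			return
		}
		w.Header().Set("Content-Type", "application/json")
		w.Write(payload)
	}))
}
```

A module's tests could then iterate over the versions it claims to support and run the same metricset assertions against each fixture set.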

For Cloudfoundry we also need something like this, because Cloudfoundry is expensive to manage (and forget about testing multiple versions with real deployments 🙂). I recently opened an issue about this: #28278

For system testing of some services we have a supported-versions.yml file in some modules; this file can be leveraged by system tests, as is done in the redis module for example.

@matschaffer
Contributor

matschaffer commented Nov 8, 2021

Ah, thanks @jsoriano, that's really interesting. That rabbitmq mock server looks like the sort of thing I was imagining for the kibana module last week.

Alternatively, I guess that if we used a Go Kibana (and ES and Logstash) client, we could go a route similar to the app client mock you pointed to.

I'm sure that'd be faster and lighter than a full mock http server.
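
As a sketch of that alternative, here is roughly what mocking at the client level could look like; the interface and method names are hypothetical, just enough to show the shape, not an existing client:

```go
package kibana_test

import "testing"

// StatsClient is a hypothetical slice of a Go Kibana client: only the calls
// the metricset would need, so a fake can replace it without any HTTP.
type StatsClient interface {
	Version() (string, error)
	Stats() (map[string]interface{}, error)
}

// fakeClient returns canned responses in place of a real Kibana.
type fakeClient struct {
	version string
	stats   map[string]interface{}
}

func (f fakeClient) Version() (string, error)               { return f.version, nil }
func (f fakeClient) Stats() (map[string]interface{}, error) { return f.stats, nil }

func TestFakeClientSatisfiesInterface(t *testing.T) {
	var c StatsClient = fakeClient{
		version: "8.0.0-SNAPSHOT",
		stats:   map[string]interface{}{"process": map[string]interface{}{"uptime_in_millis": 1000}},
	}
	if v, err := c.Version(); err != nil || v == "" {
		t.Fatalf("expected a version from the fake client, got %q (%v)", v, err)
	}
}
```

A metricset written against such an interface could be exercised entirely in-process, which is where the speed and weight advantage over a full mock HTTP server would come from.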

@botelastic

botelastic bot commented Nov 8, 2022

Hi!
We just realized that we haven't looked into this issue in a while. We're sorry!

We're labeling this issue as Stale to make it hit our filters and make sure we get back to it as soon as possible. In the meantime, it would be extremely helpful if you could take a look at it as well and confirm its relevance. A simple comment with a nice emoji will be enough 👍
Thank you for your contribution!

@botelastic botelastic bot added the Stalled label Nov 8, 2022
@botelastic botelastic bot closed this as completed May 7, 2023