
Logstash to Remote ElasticSearch #512

Open
devilman85 opened this issue Nov 26, 2024 · 6 comments
Labels
enhancement New feature or request

Comments

@devilman85

During the installation phase I answered this question as follows:
Should Malcolm use and maintain its own OpenSearch instance? (Y / n): n

1: opensearch-local - local OpenSearch
2: opensearch-remote - remote OpenSearch
3: elasticsearch-remote - remote Elasticsearch
Select primary Malcolm document store (opensearch-local): 3

Enter primary remote Elasticsearch connection URL https://192.168.1.10:9200 (in my case)

Require SSL certificate validation for communication with remote Elasticsearch instance? (y / N): n

Enter Kibana connection URL https://10.9.0.215:5601

You must run auth_setup after configure to store data store connection credentials.

But at the following prompts I don't know what to answer:
Forward Logstash logs to a secondary remote document store? (y / N): ?

1: opensearch-remote - remote OpenSearch
2: elasticsearch-remote - remote Elasticsearch
Select secondary Malcolm document store: ?

Enter secondary remote OpenSearch connection URL (?)

Require SSL certificate validation for communication with secondary remote OpenSearch instance? (y / N): n

You must run auth_setup after configure to store data store connection credentials.

Can you point me to the correct steps? I ask because, after doing an installation with the answer no to the Logstash question, the container now shows as unhealthy.

@devilman85 devilman85 added the enhancement New feature or request label Nov 26, 2024
@mmguero mmguero added this to Malcolm Nov 26, 2024
@devilman85 devilman85 closed this as not planned Nov 26, 2024
@github-project-automation github-project-automation bot moved this to Done in Malcolm Nov 26, 2024
@devilman85 devilman85 reopened this Nov 26, 2024
@devilman85
Author

help me

@mmguero
Collaborator

mmguero commented Nov 26, 2024

The secondary instance is only used if you want the data sent to another Elasticsearch cluster in addition to the primary one. It won't have anything to do with your Logstash container's health.

I'm going to be on vacation until December 2nd, but I will follow up here when I return.
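If no secondary document store is needed, those prompts can simply be declined; a possible session (a sketch, where the answer shown is just the default):

```
Forward Logstash logs to a secondary remote document store? (y / N): n
```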

@devilman85
Author

config output.conf:

output {
  elasticsearch {
    id => "output_opensearch_malcolm"
    hosts => "${OPENSEARCH_URL:https://192.168.1.10:9200}"
    ssl_certificate_verification => "false"
    user => "elastic"
    password => ""
    manage_template => false
    index => "%{[@metadata][malcolm_opensearch_index]}"
    document_id => "%{+YYMMdd}-%{[event][hash]}"
  }
}

conf 01_beats_input.conf:

input {
  beats {
    id => "input_beats"
    host => "0.0.0.0"
    port => 5044
    ssl => "${BEATS_SSL:false}"
    ssl_certificate_authorities => ["/certs/ca.crt"]
    ssl_certificate => "/certs/server.crt"
    ssl_key => "/certs/server.key"
    ssl_verify_mode => "none"
  }
}

conf internal.conf:

input {
  pipeline {
    address => "${OPENSEARCH_PIPELINE_ADDRESS_INTERNAL:internal-os}"
  }
}
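For reference, Logstash pipeline-to-pipeline communication pairs a pipeline output's send_to list with a matching pipeline input address, as in the internal.conf above. A minimal sketch (the address name here is hypothetical):

```
# sender pipeline: forwards events to the named virtual address
output {
  pipeline {
    send_to => ["internal-os"]
  }
}

# receiver pipeline: listens on that same address
input {
  pipeline {
    address => "internal-os"
  }
}
```

The "Attempted to send event to 'zeek-parse' but that address was unavailable" warning in a later comment is what Logstash logs when the sender side is up but no running pipeline has claimed the receiving address yet.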

@piercema
Collaborator

That all looks correct to me. What specific error are you experiencing?

@devilman85
Author

devilman85 commented Nov 27, 2024

from log Logstash: [WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch

][WARN ][org.logstash.plugins.pipeline.AbstractPipelineBus] Attempted to send event to 'zeek-parse' but that address was unavailable. Maybe the destination pipeline is down or stopping? Will Retry.

][WARN ][logstash.inputs.beats ] You are using a deprecated config setting "ssl_verify_mode" set in beats. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Set 'ssl_client_authentication' instead. If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"ssl_verify_mode", :plugin=><LogStash::Inputs::Beats ssl_certificate=>"/certs/server.crt", ssl_key=>"/certs/server.key", ssl_verify_mode=>"none", port=>5044, host=>"0.0.0.0", id=>"input_beats", ssl=>true, ssl_certificate_authorities=>["/certs/ca.crt"

Since my Elasticsearch cluster does not have SSL certificates, could that be the reason the malcolm-beats index is not being copied?
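As the deprecation warning above says, ssl_verify_mode in the beats input has been superseded by ssl_client_authentication. A possible updated 01_beats_input.conf (a sketch keeping the existing paths and settings; verify the option name against your Logstash version):

```
input {
  beats {
    id => "input_beats"
    host => "0.0.0.0"
    port => 5044
    ssl => "${BEATS_SSL:false}"
    ssl_certificate_authorities => ["/certs/ca.crt"]
    ssl_certificate => "/certs/server.crt"
    ssl_key => "/certs/server.key"
    # replaces the deprecated ssl_verify_mode => "none"
    ssl_client_authentication => "none"
  }
}
```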

@devilman85
Author

Now my Elastic cluster with Kibana has x509 certificates. Where do I put the file with the CA authorities in Malcolm?
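One common approach is to make the cluster's CA file visible inside the Logstash container and point the elasticsearch output at it. A sketch based on the output.conf shown earlier (the /certs/ca.crt path is an assumption about where the file is mounted, not something Malcolm configures by default):

```
output {
  elasticsearch {
    id => "output_opensearch_malcolm"
    hosts => "${OPENSEARCH_URL:https://192.168.1.10:9200}"
    user => "elastic"
    password => ""
    # with a trusted CA in place, validation can be enabled
    ssl_certificate_verification => "true"
    # path assumes the CA file is mounted into the Logstash container
    cacert => "/certs/ca.crt"
    manage_template => false
    index => "%{[@metadata][malcolm_opensearch_index]}"
    document_id => "%{+YYMMdd}-%{[event][hash]}"
  }
}
```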
