
Problems loading a plugin with :type=>"output", :name=>"kafka" after a build #175

Closed
devopsberlin opened this issue Jan 21, 2018 · 8 comments

Comments

@devopsberlin

I am getting the error below after building and installing my plugin, using docker.elastic.co/logstash/logstash-oss:6.0.0.

I made my changes and then built the gem:

git checkout master
git pull
git checkout v7.0.7
rm logstash-output-kafka-7.0.7.gem
gem build logstash-output-kafka.gemspec

Remove old plugin and install the new one:

bin/logstash-plugin remove logstash-output-kafka
  Successfully removed logstash-output-kafka
bin/logstash-plugin install --no-verify /opt/logstash-output-kafka-7.0.7.gem
  Installing logstash-output-kafka
  Installation successful
bin/logstash-plugin list | grep logstash-output-kafka
  logstash-output-kafka

Error:

[2018-01-21T20:36:04,657][ERROR][logstash.plugins.registry] Problems loading a plugin with {:type=>"output", :name=>"kafka", :path=>"logstash/outputs/kafka", :error_message=>"Could not find jar files under /usr/share/logstash/vendor/local_gems/24975053/logstash-output-kafka-7.0.7/vendor/jar-dependencies/runtime-jars/*.jar", :error_class=>LogStash::EnvironmentError, :error_backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/environment.rb:143:in `find_jars'", "/usr/share/logstash/logstash-core/lib/logstash/environment.rb:136:in `load_jars!'", "/usr/share/logstash/logstash-core/lib/logstash/environment.rb:126:in `load_runtime_jars!'", "/usr/share/logstash/vendor/local_gems/24975053/logstash-output-kafka-7.0.7/lib/logstash-output-kafka_jars.rb:5:in `<main>'", "org/jruby/RubyKernel.java:955:in `require'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/polyglot-0.3.5/lib/polyglot.rb:65:in `require'", "/usr/share/logstash/vendor/local_gems/24975053/logstash-output-kafka-7.0.7/lib/logstash/outputs/kafka.rb:1:in `<main>'", "org/jruby/RubyKernel.java:955:in `require'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/polyglot-0.3.5/lib/polyglot.rb:65:in `require'", "/usr/share/logstash/vendor/local_gems/24975053/logstash-output-kafka-7.0.7/lib/logstash/outputs/kafka.rb:4:in `(root)'", "/usr/share/logstash/logstash-core/lib/logstash/plugins/registry.rb:138:in `lookup'", "/usr/share/logstash/logstash-core/lib/logstash/plugins/registry.rb:180:in `lookup_pipeline_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/plugin.rb:140:in `lookup'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:143:in `plugin'", "/usr/share/logstash/logstash-core/lib/logstash/plugins/registry.rb:1:in `<eval>'", "org/jruby/RubyKernel.java:994:in `eval'", "/usr/share/logstash/logstash-core/lib/logstash/plugins/registry.rb:156:in `legacy_lookup'", "(eval):180:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:82:in `initialize'", 
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:215:in `block in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:35:in `block in execute'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:335:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:332:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:319:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:105:in `execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/interval.rb:18:in `block in interval'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:94:in `block in execute'"]}
[2018-01-21T20:36:04,659][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::PluginLoadingError", :message=>"Couldn't find any output plugin named 'kafka'. Are you sure this is correct? Trying to load the kafka output plugin resulted in this error: Problems loading the requested plugin named kafka of type output. Error: LogStash::EnvironmentError Could not find jar files under /usr/share/logstash/vendor/local_gems/24975053/logstash-output-kafka-7.0.7/vendor/jar-dependencies/runtime-jars/*.jar", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/plugins/registry.rb:185:in `lookup_pipeline_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/plugin.rb:140:in `lookup'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:143:in `plugin'", "(eval):180:in `<eval>'", "org/jruby/RubyKernel.java:994:in `eval'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:82:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:215:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:35:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:335:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:332:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:319:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:105:in `block in execute'", 
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/interval.rb:18:in `interval'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:94:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:362:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}

Could you please suggest how to fix this issue?
Thanks.

@devopsberlin devopsberlin changed the title Problems loading a plugin with :type=>"output", :name=>"kafka" after the build Problems loading a plugin with :type=>"output", :name=>"kafka" after a build Jan 21, 2018
@yaauie

yaauie commented Jan 22, 2018

@devopsberlin before building the gem, you will need to vendor the jar dependencies; this will ensure that they get packaged within the .gem and are available on the classpath when Logstash attempts to load them.

rake vendor && gem build logstash-output-kafka.gemspec

@devopsberlin
Author

@yaauie Thanks!

@devopsberlin
Author

@yaauie, when running rake vendor && gem build logstash-output-kafka.gemspec I get the error below. Can you please advise?
Thanks!

rake aborted!
LoadError: cannot load such file -- logstash/devutils/rake
/home/user/.rvm/rubies/ruby-2.4.1/lib/ruby/2.4.0/rubygems/core_ext/kernel_require.rb:55:in `require'
/home/user/.rvm/rubies/ruby-2.4.1/lib/ruby/2.4.0/rubygems/core_ext/kernel_require.rb:55:in `require'
logstash-sqs-kafka-local/logstash-output-kafka/Rakefile:1:in `<top (required)>'
/home/user/.rvm/gems/ruby-2.4.1@global/gems/rake-12.0.0/lib/rake/rake_module.rb:28:in `load'
/home/user/.rvm/gems/ruby-2.4.1@global/gems/rake-12.0.0/lib/rake/rake_module.rb:28:in `load_rakefile'
/home/user/.rvm/gems/ruby-2.4.1@global/gems/rake-12.0.0/lib/rake/application.rb:687:in `raw_load_rakefile'
/home/user/.rvm/gems/ruby-2.4.1@global/gems/rake-12.0.0/lib/rake/application.rb:96:in `block in load_rakefile'
/home/user/.rvm/gems/ruby-2.4.1@global/gems/rake-12.0.0/lib/rake/application.rb:178:in `standard_exception_handling'
/home/user/.rvm/gems/ruby-2.4.1@global/gems/rake-12.0.0/lib/rake/application.rb:95:in `load_rakefile'
/home/user/.rvm/gems/ruby-2.4.1@global/gems/rake-12.0.0/lib/rake/application.rb:79:in `block in run'
/home/user/.rvm/gems/ruby-2.4.1@global/gems/rake-12.0.0/lib/rake/application.rb:178:in `standard_exception_handling'
/home/user/.rvm/gems/ruby-2.4.1@global/gems/rake-12.0.0/lib/rake/application.rb:77:in `run'
/home/user/.rvm/gems/ruby-2.4.1@global/gems/rake-12.0.0/exe/rake:27:in `<top (required)>'
/home/user/.rvm/rubies/ruby-2.4.1/bin/rake:22:in `load'
/home/user/.rvm/rubies/ruby-2.4.1/bin/rake:22:in `<main>'

@devopsberlin devopsberlin reopened this Jan 23, 2018
@yaauie

yaauie commented Jan 23, 2018

@devopsberlin it appears that one of the dependencies isn't on your load path; do you have the gem's dependencies installed? From the looks of it, the logstash-devutils dependency is missing (and others may be too).

This project, like most Ruby projects, uses bundler to manage dependencies; the following will check for the bundle command, and install the gem that provides it if it is missing:

command -v bundle || gem install bundler

Once bundle is available, you will need to invoke it from the root of the project to install the project dependencies; this will inspect the Gemfile in the project's root, resolve the dependency graph, and install them.

bundle install

Once the bundle is installed, the following should vendor the jar dependencies:

rake vendor

Once the jar dependencies are vendored, the gem can be built:

gem build logstash-output-kafka.gemspec
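Before running gem build, it can be useful to confirm that the vendoring step actually produced jars, since the registry error at the top of this thread is exactly this directory being empty. A minimal sketch, using a hypothetical check_vendored_jars helper (the path mirrors the one Logstash searches at load time):

```shell
# Sketch: verify jar vendoring succeeded before building the gem.
# check_vendored_jars is a hypothetical helper, not part of the plugin.
check_vendored_jars() {
  dir="$1/vendor/jar-dependencies/runtime-jars"
  # the glob stays unexpanded (and ls fails) when no jars exist
  if ls "$dir"/*.jar >/dev/null 2>&1; then
    echo "jars vendored"
  else
    echo "no jars found in $dir"
  fi
}

# run from the plugin checkout root after `rake vendor`
check_vendored_jars "."
```

If this reports no jars, re-run rake vendor and check its output before building.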

@yaauie

yaauie commented Jan 23, 2018

@devopsberlin since the jar dependencies are loaded using JRuby, you'll also need to be using JRuby 9.1.x, with $JAVA_HOME pointing to the path of a Java 8 runtime.
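The two environment requirements above can be checked up front. A sketch with a hypothetical check_build_env helper (under JRuby, ruby reports its platform as "java"; the expected values passed in below are illustrative):

```shell
# Sketch: sanity-check the build environment before `rake vendor`.
# check_build_env is a hypothetical helper; args are the ruby platform
# string and the JAVA_HOME value to validate.
check_build_env() {
  platform="$1"
  java_home="$2"
  [ "$platform" = "java" ] || echo "warning: not running under JRuby"
  [ -n "$java_home" ] || echo "warning: JAVA_HOME is not set"
}

# e.g. check_build_env "$(ruby -e 'print RUBY_PLATFORM')" "$JAVA_HOME"
check_build_env "java" "/opt/java8"
```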

@devopsberlin
Author

@yaauie, thank you for your clear and helpful explanation, it works.

If you don't mind one more question: I am using docker.elastic.co/logstash/logstash-oss:6.0.0 with the kafka output plugin, and the plugin stops pushing data into Kafka when one of the Kafka nodes goes down or comes back with a different broker id.

[2018-01-19T20:36:47,283][WARN ][org.apache.kafka.clients.NetworkClient] Connection to node 1 could not be established. Broker may not be available.
[2018-01-19T21:16:44,320][INFO ][logstash.outputs.kafka   ] Sending batch to Kafka failed. Will retry after a delay. {:batch_size=>1, :failures=>1, :sleep=>0.01}
[2018-01-19T21:46:44,876][INFO ][logstash.outputs.kafka   ] Sending batch to Kafka failed. Will retry after a delay. {:batch_size=>1, :failures=>1, :sleep=>0.01}

The retries parameter might not help in this case, because the broker ids can differ from the ones the Logstash container started with (for example, the Kafka brokers can change from [1,2,3] to [1,2,4]).

  # If you choose to set `retries`, a value greater than zero will cause the
  # client to only retry a fixed number of times. This will result in data loss
  # if a transient error outlasts your retry count.
  #

https://www.elastic.co/guide/en/logstash/5.6/plugins-outputs-kafka.html
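For reference, a sketch of what setting retries would look like in the output block (the value is illustrative only; as the doc excerpt above warns, a finite retry count can drop data if the outage outlasts it):

```
output {
  kafka {
    bootstrap_servers => "kafka:9092"
    topic_id => "topic"
    # illustrative value: give up after 3 retries (risks data loss on long outages)
    retries => 3
  }
}
```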

Is there a way to force Logstash to exit / kill the process in this case? That way a new Logstash container would be relaunched with the new broker ids and the service would start properly.

Right now I have two workarounds: restart the service running the Logstash containers each time I change the Kafka nodes, or patch the plugin to kill the process when it catches this error message. But I wonder, do you maybe have a better solution?

Thanks again for all your help!

output {
  kafka {
    bootstrap_servers => "kafka:9092"
    topic_id => "topic"
    codec => "json"
    message_key => "key"
  }
  #stdout { codec => "rubydebug" }
}


@yaauie

yaauie commented Jan 25, 2018

@devopsberlin I’m not sure how to answer your second question, but I see that you’ve already filed it as elastic/logstash#8996. I’ll flag this with the Logstash team tomorrow and try to get an answer/fix prioritised.

@yaauie

yaauie commented Jan 25, 2018

> @yaauie, thank you for your clear and helpful explanation, it works.

Closing: the original issue is addressed, and the secondary issue is filed separately as elastic/logstash#8996.

@yaauie yaauie closed this as completed Jan 25, 2018