Custom logger doesn't work due to usage of c.Logger.SetOutput #117

Closed
develar opened this issue Aug 12, 2021 · 2 comments

develar commented Aug 12, 2021

cacheConfig.Logger = zap.NewStdLog(logger) doesn't work. You cannot use your own logger.

The issue: Olric calls c.Logger.SetOutput(filter), which means Olric ignores the output of the passed custom logger because c.Logger.SetOutput overwrites it. As a result, despite setting a custom logger, the standard log is used:

2021-08-12T11:08:17.617+0200	INFO	shutdown cache server
[INFO] Closing active streams => olric.go:627

If a custom logger is set, c.LogLevel should be ignored and the passed Logger should be used as-is.
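For reference, a minimal sketch of the failing setup described above (assuming go.uber.org/zap and Olric's config package; the helper name newCacheConfig is illustrative, not from the report):

import (
	"github.com/buraksezer/olric/config"

	"go.uber.org/zap"
)

// newCacheConfig hands zap's standard-library adapter to Olric. Before the
// fix, Olric internally called c.Logger.SetOutput(filter), which replaced
// the writer behind zap.NewStdLog, so zap never received Olric's log lines.
func newCacheConfig(logger *zap.Logger) *config.Config {
	c := config.New("local")
	c.Logger = zap.NewStdLog(logger)
	return c
}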

buraksezer self-assigned this Aug 12, 2021
buraksezer added the bug, release:v0.3.x, and release:v0.4.x labels Aug 15, 2021
buraksezer (Owner) commented

Hi @develar

Thank you for reporting this. Olric shares the same logging semantics as hashicorp/memberlist. I have just fixed the problem and tried implementing a new logger with zap. It produces the following output, which looks a bit ugly and includes some useless portions. You may need to implement a wrapper that parses the messages before writing to config.LogOutput.

I have released v0.3.12, which includes fixes for #117 and #116.

=== RUN   TestOlric_StartedCallback
{"level":"info","ts":1629016830.919082,"caller":"log/log.go:191","msg":"[DEBUG] memberlist: Using dynamic bind port 56728"}
{"level":"info","ts":1629016830.919959,"caller":"log/log.go:191","msg":"[INFO] Join completed. Synced with 0 initial nodes => olric.go:304"}
{"level":"info","ts":1629016830.921113,"caller":"log/log.go:191","msg":"[INFO] Routing table has been pushed by 127.0.0.1:56727 => routing.go:518"}
{"level":"info","ts":1629016830.921365,"caller":"log/log.go:191","msg":"[INFO] The cluster coordinator has been bootstrapped => olric.go:278"}
{"level":"info","ts":1629016830.921412,"caller":"log/log.go:191","msg":"[INFO] Olric bindAddr: 127.0.0.1, bindPort: 56727 => olric.go:376"}
{"level":"info","ts":1629016830.921427,"caller":"log/log.go:191","msg":"[INFO] Memberlist bindAddr: 192.168.1.5, bindPort: 56728 => olric.go:384"}
{"level":"info","ts":1629016830.921456,"caller":"log/log.go:191","msg":"[INFO] Cluster coordinator: 127.0.0.1:56727 => olric.go:388"}
{"level":"info","ts":1629016831.033371,"caller":"log/log.go:191","msg":"[INFO] Closing active streams => olric.go:627"}
{"level":"info","ts":1629016831.033769,"caller":"log/log.go:191","msg":"[ERROR] End of the TCP connection: connection closed => server.go:221"}
{"level":"info","ts":1629016831.033932,"caller":"log/log.go:191","msg":"[INFO] Broadcasting a leave message => discovery.go:368"}
{"level":"info","ts":1629016831.034251,"caller":"log/log.go:191","msg":"[INFO] 127.0.0.1:56727 is gone => olric.go:647"}
--- PASS: TestOlric_StartedCallback (0.12s)
PASS
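For context, output like the above appears when Olric's raw log lines (still carrying their "[LEVEL]" prefixes) are fed into zap, for example by pointing config.LogOutput at a zap-backed writer. A minimal sketch under that assumption; it is not necessarily the exact setup used in this test:

import (
	"github.com/buraksezer/olric/config"

	"go.uber.org/zap"
)

// withZapLogOutput routes Olric's log output into zap via the io.Writer
// behind zap's standard-library adapter. zap wraps every line in its own
// JSON envelope, which is why the messages above carry both zap's "level"
// field and Olric's bracketed level.
func withZapLogOutput(logger *zap.Logger) *config.Config {
	c := config.New("local")
	c.LogOutput = zap.NewStdLog(logger).Writer()
	return c
}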


develar commented Aug 16, 2021

Thanks for the quick fix.

"You may need to implement a wrapper that parses the messages before writing to config.LogOutput"

Here is such a wrapper:

// imports: "bytes", "log", "go.uber.org/zap"

// Wire the wrapper into the Olric config; prefix and flags stay empty so zap
// controls the final formatting.
c.Logger = log.New(&loggerWriter{logger}, "" /* prefix */, 0 /* flags */)

// loggerWriter adapts Olric's *log.Logger output to a zap logger.
type loggerWriter struct {
	logger *zap.Logger // assumed type; the original snippet omits the struct definition
}

// Write maps Olric's "[LEVEL] ..." prefix to the matching zap level and
// strips the prefix before forwarding the message.
func (t *loggerWriter) Write(message []byte) (int, error) {
	message = bytes.TrimSpace(message)
	if len(message) < 2 {
		t.logger.Info(string(message))
	} else {
		switch message[1] {
		case 'I': // "[INFO] "
			t.logger.Info(string(message[7:]))
		case 'W': // "[WARN] "
			t.logger.Warn(string(message[7:]))
		case 'D': // "[DEBUG] "
			t.logger.Debug(string(message[8:]))
		default:
			t.logger.Info(string(message))
		}
	}
	return len(message), nil
}

Works for me:

{"level":"debug","msg":"memberlist: Using dynamic bind port 58415"}
{"level":"info","msg":"Join completed. Synced with 0 initial nodes => olric.go:304"}
{"level":"info","msg":"Routing table has been pushed by 127.0.0.1:58414 => routing.go:518"}
{"level":"info","msg":"The cluster coordinator has been bootstrapped => olric.go:278"}
{"level":"info","msg":"Olric bindAddr: 127.0.0.1, bindPort: 58414 => olric.go:376"}
{"level":"info","msg":"Memberlist bindAddr: 192.168.1.106, bindPort: 58415 => olric.go:384"}
{"level":"info","msg":"Cluster coordinator: 127.0.0.1:58414 => olric.go:388"}
{"level":"info","msg":"Node name in the cluster: 127.0.0.1:58414 => olric.go:605"}
