
docker nomad alloc logs can be spliced #1198

Closed
camerondavison opened this issue May 23, 2016 · 6 comments

Comments

@camerondavison
Contributor

Spliced stdout log

Nomad version

nomad 0.3.2

It is hard to tell whether this is my program, Docker, or Nomad. But given that I am using Java Logback, I feel like it is probably something with Nomad's syslog appender.

I am logging as JSON, and when heavy log writes are happening, I occasionally see another log line spliced into the middle of one of my log lines.

I am looking at the task.stdout.0 log. I would post the offending entry, but it is hard to read and rather unhelpful. The splice happens at character 2330, if that helps.

It looks like {first 2330 chars of log 1}{log 2}\n{last 3102 chars of log 1}\n

@camerondavison
Contributor Author

I tested the following job:

job "echoer-raw-exec" {
  datacenters = ["dc1"]
  group "group" {
    task "echoer" {
      driver = "raw_exec"
      config {
        command = "echoer.bash"
      }
      artifact {
        source = "https://gist.githubusercontent.com/a86c6f7964/045da29e2cc5a59949361aab051eb805/raw/bcd81a9e598c8e4ec80fe3fec766bea9459032f6/echoer.bash"
      }
      resources {
        cpu = 20
        memory = 30
      }
    }
  }
}

I did not see any spliced logs, but I did notice that this check:

if remainingSize < int64(len(p[n:])) {

will write partial log lines to files before rotating them. I personally think this is a bad idea: I would not want to have to read logs from two files in order to get one full log line. Maybe this would be moot if I were just streaming the output through Nomad itself, but I think it would be a lot nicer if file rotation did not chop logs in half.
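To illustrate what I mean, here is a minimal sketch (hypothetical, not Nomad's actual rotation code) of a size-capped writer that rotates only at line boundaries: a line that would overflow the current file starts the next file instead of being split across two.

```go
package main

import (
	"bytes"
	"fmt"
)

// lineRotator caps each "file" (a buffer here) at maxSize bytes, but
// never splits a newline-terminated line across files.
type lineRotator struct {
	maxSize int
	chunks  []*bytes.Buffer // stands in for rotated log files
	cur     *bytes.Buffer
}

func newLineRotator(maxSize int) *lineRotator {
	r := &lineRotator{maxSize: maxSize}
	r.rotate()
	return r
}

func (r *lineRotator) rotate() {
	r.cur = &bytes.Buffer{}
	r.chunks = append(r.chunks, r.cur)
}

// Write expects newline-terminated lines and rotates *before* a line
// that would push the current file past maxSize, not in the middle of it.
func (r *lineRotator) Write(p []byte) (int, error) {
	n := len(p)
	for len(p) > 0 {
		i := bytes.IndexByte(p, '\n')
		if i < 0 {
			i = len(p) - 1 // trailing partial line: keep it together anyway
		}
		line := p[:i+1]
		if r.cur.Len() > 0 && r.cur.Len()+len(line) > r.maxSize {
			r.rotate()
		}
		r.cur.Write(line)
		p = p[i+1:]
	}
	return n, nil
}

func main() {
	r := newLineRotator(10)
	r.Write([]byte("aaaa\nbbbbbbbb\ncc\n"))
	for i, c := range r.chunks {
		fmt.Printf("file %d: %q\n", i, c.String())
	}
	// file 0: "aaaa\n"
	// file 1: "bbbbbbbb\n"
	// file 2: "cc\n"
}
```

The trade-off is that a file can exceed maxSize by up to one line, which seems better than making readers stitch a line back together from two files.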

I am working on doing the same with docker now.

@camerondavison
Contributor Author

Inside the Vagrant SSH host that y'all defined, I see this Docker version:

$ docker version
Client:
 Version:      1.11.1
 API version:  1.23
 Go version:   go1.5.4
 Git commit:   5604cbe
 Built:        Tue Apr 26 23:30:23 2016
 OS/Arch:      linux/amd64

Server:
 Version:      1.11.1
 API version:  1.23
 Go version:   go1.5.4
 Git commit:   5604cbe
 Built:        Tue Apr 26 23:30:23 2016
 OS/Arch:      linux/amd64

I ran the following Dockerfile:

FROM alpine
RUN apk add --update bash && rm -rf /var/cache/apk/*
ADD https://gist.githubusercontent.com/a86c6f7964/045da29e2cc5a59949361aab051eb805/raw/4e7b8d762302d33fc305760ab49553998b762db7/echoer.bash /bin/echoer.bash
ENTRYPOINT ["bash","/bin/echoer.bash"]

built it with:

docker build -t example:docker .

and ran this job:
job "echoer-docker" {
  datacenters = ["dc1"]
  group "group" {
    task "echoer" {
      driver = "docker"
      config {
        image = "example:docker"
      }
      resources {
        cpu = 20
        memory = 30
      }
    }
  }
}

I am seeing tons of duplicate logs, missing logs, and also a strange prefix (02:33 docker/5ebbff501753[16179]:) on the front of all of the log lines. It seems like having something like #688, if only as a fallback, would be good. I am not keen on having duplicate and missing logs.

@camerondavison
Contributor Author

I am pretty new to Go, but is the problem that the []byte returned by the scanner gets reused? https://golang.org/pkg/bufio/#example_Scanner_lines says "The underlying array may point to data that will be overwritten by a subsequent call to Scan." Maybe the parser should make a copy of the bytes to be safe here:

Message: line[msgIdx:],

@camerondavison
Contributor Author

I literally finished testing #1323 when you posted that. Thanks for the quick fix. I am hoping this can make it into 0.4.0

@dadgar
Contributor

dadgar commented Jun 20, 2016

@a86c6f7964 It will 👍

@github-actions

I'm going to lock this issue because it has been closed for 120 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Dec 21, 2022