
Fails to add status to ingress after running "for some time" #3180

Closed
cjohansen opened this issue Oct 4, 2018 · 5 comments · Fixed by #3267
Labels
triage/needs-information Indicates an issue needs more information in order to work on it.

Comments

@cjohansen

NGINX Ingress controller version: 0.18.0

Kubernetes version (use kubectl version): 1.10.3

Environment:

  • Cloud provider or hardware configuration: AWS
  • OS (e.g. from /etc/os-release): Debian GNU/Linux 8 (jessie)
  • Kernel (e.g. uname -a): 4.4.121-k8s
  • Install tools: kops

What happened

It's been about a month since I set up Kubernetes with the ingress controller. At the time I created a few services, and everything worked as expected. I sat down to deploy a new service today and noticed that no load balancer was connected to the ingress. I checked the ingress controller's logs and found:

W1003 14:27:48.637661       6 controller.go:804] Service "default/move-api-service" does not have any active Endpoint.

I checked kubectl get ep and saw:

NAME               ENDPOINTS                           AGE
move-api-service   100.104.0.5:9012,100.116.0.3:9012   59m

I double- and triple-checked my template, deleted all the resources, and retried. Unfortunately I am not entirely sure whether I made any changes that matter at this point. Still no luck. The log now reads:

W1004 07:43:44.040443       6 controller.go:729] Error obtaining Endpoints for Service "default/move-api-service": no object matching key "default/move-api-service" in local store

I eventually decided that this simply was not correct, so I tried killing one of the two ingress pods. Shortly after the new one booted up, the ingress was properly configured. The new pod's log now reads:

I1004 08:39:07.052549       6 event.go:221] Event(v1.ObjectReference{Kind:"Ingress", Namespace:"default", Name:"move-service-web", UID:"06a10352-c7aa-11e8-b72b-0ae13d9698c2", APIVersion:"extensions/v1beta1", ResourceVersion:"6016883", FieldPath:""}): type: 'Normal' reason: 'UPDATE' Ingress default/move-service-web

What you expected to happen:

Not needing to restart the NGINX ingress controller in order for my ingresses to be correctly configured.

How to reproduce it (as minimally and precisely as possible):

Unfortunately I do not know. Possibly: leave the ingress controller running for 20+ days without any new events to respond to, then try to create a new service+ingress.
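For anyone attempting the reproduction described above, a minimal Service + Ingress pair might look like the following sketch. All names, ports, the host, and the selector are placeholders (not taken from this report); the Ingress uses the extensions/v1beta1 API version that was current at the time.

```yaml
# Minimal Service + Ingress pair for attempting a reproduction.
# Names, ports, host, and selector are illustrative placeholders.
apiVersion: v1
kind: Service
metadata:
  name: repro-service
  namespace: default
spec:
  selector:
    app: repro
  ports:
    - port: 80
      targetPort: 8080
---
apiVersion: extensions/v1beta1   # API version in use at the time of this report
kind: Ingress
metadata:
  name: repro-ingress
  namespace: default
spec:
  rules:
    - host: repro.example.com
      http:
        paths:
          - path: /
            backend:
              serviceName: repro-service
              servicePort: 80
```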

Anything else we need to know:

I'm sorry for reporting something that is probably very hard to follow up on. I'm not 100% sure this wasn't some sort of mistake on my part, but I'm reporting it anyway, since restarting one of the ingress-nginx pods fixed the problem, which was somewhat surprising. I'm hoping someone who knows the internals better might have a eureka moment from this report...

I still have one of the old pods running, and would happily provide more logs etc if needed.

@aledbf (Member) commented Oct 8, 2018

@cjohansen if you see this behavior again, please update the issue by running:

kubectl port-forward -n ingress-nginx deployment/nginx-ingress-controller 10254

and post the output of:

http://localhost:10254/debug/pprof/
http://localhost:10254/debug/pprof/goroutine?debug=1
http://localhost:10254/debug/pprof/block?debug=1

This can help us narrow down where this issue is being triggered.
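The steps above can be collected in one go with a short script. This is a sketch, not part of the original request: the deployment name and namespace are copied from the port-forward command above and may need adjusting to match your install, and the output filenames are my own convention.

```shell
#!/bin/sh
# Forward the controller's status port in the background, then fetch
# the three pprof endpoints requested above.
kubectl port-forward -n ingress-nginx deployment/nginx-ingress-controller 10254 &
PF_PID=$!
sleep 2  # give the port-forward a moment to establish

for profile in "goroutine?debug=1" "block?debug=1"; do
  out="pprof-${profile%%\?*}.txt"           # e.g. pprof-goroutine.txt
  curl -s "http://localhost:10254/debug/pprof/${profile}" -o "$out"
done
curl -s "http://localhost:10254/debug/pprof/" -o pprof-index.html

kill "$PF_PID"
```

The resulting files can then be attached to the issue.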

@aledbf aledbf added the triage/needs-information Indicates an issue needs more information in order to work on it. label Oct 8, 2018
@cjohansen (Author) commented

Thanks! I haven't seen this again, but I still have one of the old (presumed "bad") pods running. Would it be any use to get the output of this now?

@aledbf (Member) commented Oct 8, 2018

Would it be any use to get the output of this now?

Yes please :)

@cjohansen (Author) commented

http://localhost:10254/debug/pprof/

Count | Profile
-- | --
0 | block
81 | goroutine
1058 | heap
0 | mutex
19 | threadcreate

http://localhost:10254/debug/pprof/goroutine?debug=1

goroutine profile: total 80
6 @ 0x42c90a 0x43c620 0x1194428 0x459d51
#	0x1194427	k8s.io/ingress-nginx/internal/watch.(*OSFileWatcher).watch.func1+0xf7	/go/src/k8s.io/ingress-nginx/internal/watch/file_watcher.go:66

6 @ 0x47dff5 0xe43dea 0x11937e1 0x1192a14 0x459d51
#	0x47dff4	syscall.Syscall6+0x4									/usr/local/go/src/syscall/asm_linux_amd64.s:44
#	0xe43de9	k8s.io/ingress-nginx/vendor/golang.org/x/sys/unix.EpollWait+0x79			/go/src/k8s.io/ingress-nginx/vendor/golang.org/x/sys/unix/zsyscall_linux_amd64.go:1520
#	0x11937e0	k8s.io/ingress-nginx/vendor/gopkg.in/fsnotify/fsnotify%2ev1.(*fdPoller).wait+0x90	/go/src/k8s.io/ingress-nginx/vendor/gopkg.in/fsnotify/fsnotify.v1/inotify_poller.go:86
#	0x1192a13	k8s.io/ingress-nginx/vendor/gopkg.in/fsnotify/fsnotify%2ev1.(*Watcher).readEvents+0x193	/go/src/k8s.io/ingress-nginx/vendor/gopkg.in/fsnotify/fsnotify.v1/inotify.go:192

5 @ 0x42c90a 0x42c9be 0x404212 0x403ecb 0xed7016 0xedc8e4 0x7eb571 0x7eb62f 0x459d51
#	0xed7015	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*sharedProcessor).run+0x45									/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/shared_informer.go:425
#	0xedc8e3	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*sharedProcessor).(k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.run)-fm+0x33	/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/shared_informer.go:220
#	0x7eb570	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.(*Group).StartWithChannel.func1+0x30							/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:54
#	0x7eb62e	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1+0x4e									/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:71

5 @ 0x42c90a 0x42c9be 0x404212 0x403ecb 0xedb494 0x459d51
#	0xedb493	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*controller).Run.func1+0x33	/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/controller.go:103

5 @ 0x42c90a 0x42c9be 0x404212 0x403f0b 0xedc0e4 0x7eb0dc 0xedc371 0x7eb6b4 0x7eae4d 0x7ead7d 0xed7a28 0xedc92a 0x7eb62f 0x459d51
#	0xedc0e3	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*processorListener).run.func1.1+0x53								/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/shared_informer.go:549
#	0x7eb0db	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.ExponentialBackoff+0x9b									/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:203
#	0xedc370	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*processorListener).run.func1+0x80								/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/shared_informer.go:548
#	0x7eb6b3	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil.func1+0x53									/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133
#	0x7eae4c	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil+0xbc										/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:134
#	0x7ead7c	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.Until+0x4c										/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:88
#	0xed7a27	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*processorListener).run+0x77									/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/shared_informer.go:546
#	0xedc929	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*processorListener).(k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.run)-fm+0x29	/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/shared_informer.go:390
#	0x7eb62e	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1+0x4e									/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:71

5 @ 0x42c90a 0x42c9be 0x43e3cb 0x46da60 0x7c1d6f 0x7d9ba1 0x470ec6 0xe23c70 0xe41c75 0xe422cc 0x7ecd7e 0x459d51
#	0x43e3ca	sync.runtime_notifyListWait+0x10a										/usr/local/go/src/runtime/sema.go:510
#	0x46da5f	sync.(*Cond).Wait+0x7f												/usr/local/go/src/sync/cond.go:56
#	0x7c1d6e	k8s.io/ingress-nginx/vendor/golang.org/x/net/http2.(*pipe).Read+0x8e						/go/src/k8s.io/ingress-nginx/vendor/golang.org/x/net/http2/pipe.go:64
#	0x7d9ba0	k8s.io/ingress-nginx/vendor/golang.org/x/net/http2.transportResponseBody.Read+0xa0				/go/src/k8s.io/ingress-nginx/vendor/golang.org/x/net/http2/transport.go:1865
#	0x470ec5	io.ReadAtLeast+0x85												/usr/local/go/src/io/io.go:309
#	0xe23c6f	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/framer.(*lengthDelimitedFrameReader).Read+0x29f	/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/framer/framer.go:76
#	0xe41c74	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/runtime/serializer/streaming.(*decoder).Decode+0x94		/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/runtime/serializer/streaming/streaming.go:77
#	0xe422cb	k8s.io/ingress-nginx/vendor/k8s.io/client-go/rest/watch.(*Decoder).Decode+0x7b					/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/rest/watch/decoder.go:49
#	0x7ecd7d	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive+0x12d			/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:93

5 @ 0x42c90a 0x42c9be 0x43e3cb 0x46da60 0xece799 0xecc4d0 0xedc73a 0x7eb6b4 0x7eae4d 0x7ead7d 0xecc331 0xed580f 0x459d51
#	0x43e3ca	sync.runtime_notifyListWait+0x10a															/usr/local/go/src/runtime/sema.go:510
#	0x46da5f	sync.(*Cond).Wait+0x7f																	/usr/local/go/src/sync/cond.go:56
#	0xece798	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*DeltaFIFO).Pop+0x98										/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/delta_fifo.go:431
#	0xecc4cf	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*controller).processLoop+0x3f									/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/controller.go:150
#	0xedc739	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*controller).(k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.processLoop)-fm+0x29	/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/controller.go:124
#	0x7eb6b3	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil.func1+0x53									/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133
#	0x7eae4c	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil+0xbc										/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:134
#	0x7ead7c	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.Until+0x4c										/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:88
#	0xecc330	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*controller).Run+0x2a0									/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/controller.go:124
#	0xed580e	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*sharedIndexInformer).Run+0x43e								/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/shared_informer.go:227

5 @ 0x42c90a 0x43c620 0xed38fa 0xed3313 0xedb753 0x7eb6b4 0x7eae4d 0x7ead7d 0xed2277 0xedc6f4 0x7eb571 0x7eb62f 0x459d51
#	0xed38f9	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*Reflector).watchHandler+0x259		/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/reflector.go:373
#	0xed3312	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch+0xf62		/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/reflector.go:339
#	0xedb752	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*Reflector).Run.func1+0x32			/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/reflector.go:204
#	0x7eb6b3	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil.func1+0x53			/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133
#	0x7eae4c	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil+0xbc				/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:134
#	0x7ead7c	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.Until+0x4c				/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:88
#	0xed2276	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*Reflector).Run+0x156				/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/reflector.go:203
#	0xedc6f3	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*Reflector).Run-fm+0x33			/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/controller.go:122
#	0x7eb570	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.(*Group).StartWithChannel.func1+0x30	/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:54
#	0x7eb62e	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1+0x4e			/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:71

5 @ 0x42c90a 0x43c620 0xed7824 0xedc96a 0x7eb62f 0x459d51
#	0xed7823	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*processorListener).pop+0x163									/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/shared_informer.go:517
#	0xedc969	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*processorListener).(k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.pop)-fm+0x29	/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/shared_informer.go:391
#	0x7eb62e	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1+0x4e									/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:71

5 @ 0x42c90a 0x43c620 0xedb952 0x459d51
#	0xedb951	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache.(*Reflector).ListAndWatch.func1+0x171	/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/cache/reflector.go:278

4 @ 0x42c90a 0x42c9be 0x404212 0x403f0b 0xf023d4 0x459d51
#	0xf023d3	k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/record.(*eventBroadcasterImpl).StartEventWatcher.func1+0xa3	/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/tools/record/event.go:231

3 @ 0x42c90a 0x42c9be 0x404212 0x403f0b 0x7ec499 0x459d51
#	0x7ec498	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/watch.(*Broadcaster).loop+0x58	/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/watch/mux.go:207

2 @ 0x42c90a 0x42c9be 0x43e3cb 0x46da60 0x118a9cb 0x118b3cc 0x118b8bc 0x118d536 0x118deaa 0x7eb6b4 0x7eae4d 0x7ead7d 0x118ce95 0x459d51
#	0x43e3ca	sync.runtime_notifyListWait+0x10a								/usr/local/go/src/runtime/sema.go:510
#	0x46da5f	sync.(*Cond).Wait+0x7f										/usr/local/go/src/sync/cond.go:56
#	0x118a9ca	k8s.io/ingress-nginx/vendor/k8s.io/client-go/util/workqueue.(*Type).Get+0x8a			/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/util/workqueue/queue.go:124
#	0x118d535	k8s.io/ingress-nginx/internal/task.(*Queue).worker+0x65						/go/src/k8s.io/ingress-nginx/internal/task/queue.go:111
#	0x118dea9	k8s.io/ingress-nginx/internal/task.(*Queue).(k8s.io/ingress-nginx/internal/task.worker)-fm+0x29	/go/src/k8s.io/ingress-nginx/internal/task/queue.go:61
#	0x7eb6b3	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil.func1+0x53		/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133
#	0x7eae4c	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil+0xbc			/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:134
#	0x7ead7c	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.Until+0x4c			/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:88
#	0x118ce94	k8s.io/ingress-nginx/internal/task.(*Queue).Run+0x54						/go/src/k8s.io/ingress-nginx/internal/task/queue.go:61

2 @ 0x42c90a 0x43c620 0x1189527 0x459d51
#	0x1189526	k8s.io/ingress-nginx/vendor/k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop+0x3a6	/go/src/k8s.io/ingress-nginx/vendor/k8s.io/client-go/util/workqueue/delaying_queue.go:206

2 @ 0x42c90a 0x43c620 0x7eb859 0x459d51
#	0x7eb858	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.poller.func1.1+0x178	/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:374

1 @ 0x11b2858 0x11b2660 0x11af1a4 0x11ba96d 0x11bacf1 0x780c04 0x782870 0x7838ac 0x77fc21 0x459d51
#	0x11b2857	runtime/pprof.writeRuntimeProfile+0x97	/usr/local/go/src/runtime/pprof/pprof.go:679
#	0x11b265f	runtime/pprof.writeGoroutine+0x9f	/usr/local/go/src/runtime/pprof/pprof.go:641
#	0x11af1a3	runtime/pprof.(*Profile).WriteTo+0x3e3	/usr/local/go/src/runtime/pprof/pprof.go:310
#	0x11ba96c	net/http/pprof.handler.ServeHTTP+0x20c	/usr/local/go/src/net/http/pprof/pprof.go:243
#	0x11bacf0	net/http/pprof.Index+0x1d0		/usr/local/go/src/net/http/pprof/pprof.go:254
#	0x780c03	net/http.HandlerFunc.ServeHTTP+0x43	/usr/local/go/src/net/http/server.go:1947
#	0x78286f	net/http.(*ServeMux).ServeHTTP+0x12f	/usr/local/go/src/net/http/server.go:2337
#	0x7838ab	net/http.serverHandler.ServeHTTP+0xbb	/usr/local/go/src/net/http/server.go:2694
#	0x77fc20	net/http.(*conn).serve+0x650		/usr/local/go/src/net/http/server.go:1830

1 @ 0x40f2e2 0x4413b6 0x11bb472 0x459d51
#	0x4413b5	os/signal.signal_recv+0xa5	/usr/local/go/src/runtime/sigqueue.go:139
#	0x11bb471	os/signal.loop+0x21		/usr/local/go/src/os/signal/signal_unix.go:22

1 @ 0x42c90a 0x427c2a 0x4272a7 0x492fcb 0x49304d 0x493ead 0x514ecf 0x525f4a 0x6e5e06 0x6e6310 0x6e98b0 0x4ecd48 0x470ec6 0x471038 0x7b857b 0x7b8e04 0x7d7d5e 0x7d7648 0x459d51
#	0x4272a6	internal/poll.runtime_pollWait+0x56							/usr/local/go/src/runtime/netpoll.go:173
#	0x492fca	internal/poll.(*pollDesc).wait+0x9a							/usr/local/go/src/internal/poll/fd_poll_runtime.go:85
#	0x49304c	internal/poll.(*pollDesc).waitRead+0x3c							/usr/local/go/src/internal/poll/fd_poll_runtime.go:90
#	0x493eac	internal/poll.(*FD).Read+0x17c								/usr/local/go/src/internal/poll/fd_unix.go:157
#	0x514ece	net.(*netFD).Read+0x4e									/usr/local/go/src/net/fd_unix.go:202
#	0x525f49	net.(*conn).Read+0x69									/usr/local/go/src/net/net.go:176
#	0x6e5e05	crypto/tls.(*block).readFromUntil+0x95							/usr/local/go/src/crypto/tls/conn.go:493
#	0x6e630f	crypto/tls.(*Conn).readRecord+0xdf							/usr/local/go/src/crypto/tls/conn.go:595
#	0x6e98af	crypto/tls.(*Conn).Read+0xff								/usr/local/go/src/crypto/tls/conn.go:1156
#	0x4ecd47	bufio.(*Reader).Read+0x237								/usr/local/go/src/bufio/bufio.go:216
#	0x470ec5	io.ReadAtLeast+0x85									/usr/local/go/src/io/io.go:309
#	0x471037	io.ReadFull+0x57									/usr/local/go/src/io/io.go:327
#	0x7b857a	k8s.io/ingress-nginx/vendor/golang.org/x/net/http2.readFrameHeader+0x7a			/go/src/k8s.io/ingress-nginx/vendor/golang.org/x/net/http2/frame.go:237
#	0x7b8e03	k8s.io/ingress-nginx/vendor/golang.org/x/net/http2.(*Framer).ReadFrame+0xa3		/go/src/k8s.io/ingress-nginx/vendor/golang.org/x/net/http2/frame.go:492
#	0x7d7d5d	k8s.io/ingress-nginx/vendor/golang.org/x/net/http2.(*clientConnReadLoop).run+0x8d	/go/src/k8s.io/ingress-nginx/vendor/golang.org/x/net/http2/transport.go:1603
#	0x7d7647	k8s.io/ingress-nginx/vendor/golang.org/x/net/http2.(*ClientConn).readLoop+0x67		/go/src/k8s.io/ingress-nginx/vendor/golang.org/x/net/http2/transport.go:1531

1 @ 0x42c90a 0x427c2a 0x4272a7 0x492fcb 0x49304d 0x495448 0x5157e2 0x52f87e 0x52de59 0x78576f 0x783c75 0x7839c9 0x11e8ce1 0x459d51
#	0x4272a6	internal/poll.runtime_pollWait+0x56		/usr/local/go/src/runtime/netpoll.go:173
#	0x492fca	internal/poll.(*pollDesc).wait+0x9a		/usr/local/go/src/internal/poll/fd_poll_runtime.go:85
#	0x49304c	internal/poll.(*pollDesc).waitRead+0x3c		/usr/local/go/src/internal/poll/fd_poll_runtime.go:90
#	0x495447	internal/poll.(*FD).Accept+0x1a7		/usr/local/go/src/internal/poll/fd_unix.go:372
#	0x5157e1	net.(*netFD).accept+0x41			/usr/local/go/src/net/fd_unix.go:238
#	0x52f87d	net.(*TCPListener).accept+0x2d			/usr/local/go/src/net/tcpsock_posix.go:136
#	0x52de58	net.(*TCPListener).AcceptTCP+0x48		/usr/local/go/src/net/tcpsock.go:246
#	0x78576e	net/http.tcpKeepAliveListener.Accept+0x2e	/usr/local/go/src/net/http/server.go:3216
#	0x783c74	net/http.(*Server).Serve+0x1a4			/usr/local/go/src/net/http/server.go:2770
#	0x7839c8	net/http.(*Server).ListenAndServe+0xa8		/usr/local/go/src/net/http/server.go:2711
#	0x11e8ce0	main.startHTTPServer+0x140			/go/src/k8s.io/ingress-nginx/cmd/nginx/main.go:307

1 @ 0x42c90a 0x427c2a 0x4272a7 0x492fcb 0x49304d 0x495448 0x5157e2 0x5363d2 0x534399 0xfcde48 0x459d51
#	0x4272a6	internal/poll.runtime_pollWait+0x56							/usr/local/go/src/runtime/netpoll.go:173
#	0x492fca	internal/poll.(*pollDesc).wait+0x9a							/usr/local/go/src/internal/poll/fd_poll_runtime.go:85
#	0x49304c	internal/poll.(*pollDesc).waitRead+0x3c							/usr/local/go/src/internal/poll/fd_poll_runtime.go:90
#	0x495447	internal/poll.(*FD).Accept+0x1a7							/usr/local/go/src/internal/poll/fd_unix.go:372
#	0x5157e1	net.(*netFD).accept+0x41								/usr/local/go/src/net/fd_unix.go:238
#	0x5363d1	net.(*UnixListener).accept+0x31								/usr/local/go/src/net/unixsock_posix.go:162
#	0x534398	net.(*UnixListener).Accept+0x48								/usr/local/go/src/net/unixsock.go:253
#	0xfcde47	k8s.io/ingress-nginx/internal/ingress/metric/collectors.(*SocketCollector).Start+0x37	/go/src/k8s.io/ingress-nginx/internal/ingress/metric/collectors/socket.go:302

1 @ 0x42c90a 0x42c9be 0x404212 0x403ecb 0x11e7e10 0x459d51
#	0x11e7e0f	main.handleSigterm+0x9f	/go/src/k8s.io/ingress-nginx/cmd/nginx/main.go:157

1 @ 0x42c90a 0x42c9be 0x404212 0x403f0b 0x4fa76b 0x459d51
#	0x4fa76a	k8s.io/ingress-nginx/vendor/github.com/golang/glog.(*loggingT).flushDaemon+0x8a	/go/src/k8s.io/ingress-nginx/vendor/github.com/golang/glog/glog.go:882

1 @ 0x42c90a 0x42c9be 0x404212 0x403f0b 0xfc995c 0xfcfe78 0xfd0f66 0x459d51
#	0xfc995b	k8s.io/ingress-nginx/internal/ingress/metric/collectors.nginxStatusCollector.Start+0xab	/go/src/k8s.io/ingress-nginx/internal/ingress/metric/collectors/nginx_status.go:130
#	0xfd0f65	k8s.io/ingress-nginx/internal/ingress/metric.(*collector).Start.func1+0x45		/go/src/k8s.io/ingress-nginx/internal/ingress/metric/main.go:121

1 @ 0x42c90a 0x42c9be 0x404212 0x403f0b 0xfcb84a 0xfcf90b 0x459d51
#	0xfcb849	k8s.io/ingress-nginx/internal/ingress/metric/collectors.namedProcess.Start+0xa9	/go/src/k8s.io/ingress-nginx/internal/ingress/metric/collectors/process.go:175

1 @ 0x42c90a 0x43c620 0x11a5856 0x11e79ee 0x42c4b2 0x459d51
#	0x11a5855	k8s.io/ingress-nginx/internal/ingress/controller.(*NGINXController).Start+0x345	/go/src/k8s.io/ingress-nginx/internal/ingress/controller/nginx.go:284
#	0x11e79ed	main.main+0x9cd									/go/src/k8s.io/ingress-nginx/cmd/nginx/main.go:149
#	0x42c4b1	runtime.main+0x211								/usr/local/go/src/runtime/proc.go:198

1 @ 0x42c90a 0x43c620 0x456a64 0x459d51
#	0x42c909	runtime.gopark+0x119		/usr/local/go/src/runtime/proc.go:291
#	0x43c61f	runtime.selectgo+0xe4f		/usr/local/go/src/runtime/select.go:392
#	0x456a63	runtime.ensureSigM.func1+0x1f3	/usr/local/go/src/runtime/signal_unix.go:549

1 @ 0x42c90a 0x43c620 0x7eaefa 0x7ead7d 0x459d51
#	0x7eaef9	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil+0x169	/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:145
#	0x7ead7c	k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait.Until+0x4c	/go/src/k8s.io/ingress-nginx/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:88

1 @ 0x42c90a 0x43c620 0xeef950 0x459d51
#	0xeef94f	k8s.io/ingress-nginx/vendor/github.com/eapache/channels.(*RingChannel).ringBuffer+0x21f	/go/src/k8s.io/ingress-nginx/vendor/github.com/eapache/channels/ring_channel.go:87

1 @ 0x47dff5 0x49ff08 0x49913c 0x4986eb 0xaa4a8c 0x11ab99b 0x459d51
#	0x47dff4	syscall.Syscall6+0x4									/usr/local/go/src/syscall/asm_linux_amd64.s:44
#	0x49ff07	os.(*Process).blockUntilWaitable+0x97							/usr/local/go/src/os/wait_waitid.go:31
#	0x49913b	os.(*Process).wait+0x3b									/usr/local/go/src/os/exec_unix.go:22
#	0x4986ea	os.(*Process).Wait+0x2a									/usr/local/go/src/os/exec.go:123
#	0xaa4a8b	os/exec.(*Cmd).Wait+0x5b								/usr/local/go/src/os/exec/exec.go:461
#	0x11ab99a	k8s.io/ingress-nginx/internal/ingress/controller.(*NGINXController).start.func1+0x2a	/go/src/k8s.io/ingress-nginx/internal/ingress/controller/nginx.go:380

1 @ 0x77a600 0x459d51
#	0x77a600	net/http.(*connReader).backgroundRead+0x0	/usr/local/go/src/net/http/server.go:667

http://localhost:10254/debug/pprof/block?debug=1

--- contention:
cycles/second=2300178951

@djmcgreal commented

I also had to restart the nginx-ingress-controller. Previously it was cycling through reloads and not finding any active Endpoints, as described above.

This was on a fresh install on Docker for Mac following the installation steps on your Deploy page (mandatory and cloud-provider yamls).

Unfortunately I restarted it before reading your request for debug information.
