
Error: socket hang up #369

Closed
gartmeier opened this issue Mar 9, 2018 · 12 comments · Fixed by #376

@gartmeier

We're all using macOS at my company, and after three to five minutes of using james-proxy with HTTPS the following error occurs:

Error: socket hang up
at createHangUpError (_http_client.js:302:15)
at Socket.socketOnEnd (_http_client.js:394:23)
at emitNone (events.js:91:20)
at Socket.emit (events.js:188:7)
at endReadableNT (_stream_readable.js:975:12)
at _combinedTickCallback (internal/process/next_tick.js:80:11)
at process._tickDomainCallback (internal/process/next_tick.js:128:9)

Kind Regards
Joshua

@gartmeier
Author

Additional information: we're not using the browser links; we configured the proxy in the network settings instead.

[Screenshot: proxy configured in the macOS network settings, 2018-03-09 16:32:53]

@mitchhentges
Member

Hey, thanks for the bug report - I think that's the same one we're seeing internally.

Two questions:

  • Can you remember doing anything in particular to trigger this bug, other than just using James for a couple of minutes?
  • Do you experience any bad behaviour when this bug occurs (maybe something like a request not showing up in James)?

@gartmeier
Author

Hi there,

  1. No, I don't. I was opening github.com when it crashed the last time (see log.txt for more information). Is there any other log file that would help you?
  2. I opened GitHub and there was no response for about 30 seconds before James crashed.

Kind Regards
Joshua

log.txt

@mitchhentges
Member

Oh jeez, it crashed? Sounds pretty serious.
That log file is super helpful, thanks.

I'll take a look when I've got some free time.

@Defilan

Defilan commented May 1, 2018

Hey @torfeld6 and @mitchhentges, I just wanted to hop on and report that this exact same issue is happening on a Fedora 27 system too.

Error: socket hang up
at createHangUpError (_http_client.js:302:15)
at Socket.socketOnEnd (_http_client.js:394:23)
at emitNone (events.js:91:20)
at Socket.emit (events.js:188:7)
at endReadableNT (_stream_readable.js:975:12)
at _combinedTickCallback (internal/process/next_tick.js:80:11)
at process._tickDomainCallback (internal/process/next_tick.js:128:9)

Thanks!

mitchhentges self-assigned this Sep 14, 2018
@mitchhentges
Member

Looking into this further, it seems like there's a very quick performance hit/leak once you visit more than a couple of sites. My guess is that once the app hangs for long enough, whatever connection it was proxying gives up (and hangs up), and around that time the app crashes, too.

Ideally, addressing the performance issues will solve this in the process.

@nerdbeere
Member

nerdbeere commented Sep 17, 2018

Also looked into this a bit and I found a fairly easy way to reproduce this:

  1. Make sure HTTPS proxying works
  2. Open a new browser (Chrome in my case)
  3. Open 30+ Amazon tabs
  4. Wait for a minute or so until the crash occurs

Pro Tip™: you can select multiple Chrome tabs at once by shift-clicking them and reload them all via the context menu.

Performance is really bad, so I decided to disable basically everything going on in James except the actual request interception by commenting out these lines: https://github.com/james-proxy/james/blob/master/src/main/proxy.js#L127-L129
This led to a significant boost in performance, but the error still occurs.

The stack trace suggests that this error happens when hoxy sends out the request. As it turns out, hoxy does not properly handle errors.

My conclusion:

  • James slows down proxy performance, but that seems unrelated to the socket hang up error
  • It's unclear to me why this error occurs; maybe it's some sort of timeout
  • Hoxy needs proper error handling (illustrated below)
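
To make the last point concrete, here's a minimal sketch in plain Node.js (not hoxy's code); the localhost:9 target is just assumed to have nothing listening:

// Minimal sketch (plain Node.js, not hoxy's code) of why a missing 'error'
// listener is fatal: an 'error' event with no listener is re-thrown as an
// uncaught exception and kills the process.
const http = require('http');

// Assumption: nothing is listening on localhost:9, so this request fails.
const req = http.request({ hostname: 'localhost', port: 9, path: '/' });

// Without this listener, errors such as ECONNREFUSED here, or the
// ECONNRESET / "socket hang up" seen above, take the whole proxy down.
// With it, the failure is just logged and the process keeps running.
req.on('error', (err) => console.error('request failed:', err.message));

req.end();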

@nerdbeere
Member

nerdbeere commented Sep 17, 2018

Update: https://github.com/james-proxy/hoxy/pull/1/files seems to fix the crash.

I changed 3 things (rough sketch below):

  1. Add error handling for the HTTP server
  2. Add error handling for the HTTP client
  3. End the request as suggested in the Node docs
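
Since the actual patch lives behind the PR link above, here is a rough sketch of what those three changes amount to in plain Node.js; the hard-coded example.com upstream and port 8080 are assumptions for illustration only, not how hoxy picks its target:

const http = require('http');

const server = http.createServer((clientReq, clientRes) => {
  // Hypothetical upstream target; hoxy derives the real one from the
  // intercepted request.
  const upstream = http.request(
    { hostname: 'example.com', path: clientReq.url, method: clientReq.method },
    (upstreamRes) => {
      clientRes.writeHead(upstreamRes.statusCode, upstreamRes.headers);
      upstreamRes.pipe(clientRes);
    }
  );

  // 2. Error handling for the HTTP client: a timed-out or hung-up upstream
  //    socket becomes a 502 instead of an uncaught exception.
  upstream.on('error', (err) => {
    console.error('upstream error:', err.message);
    if (!clientRes.headersSent) clientRes.writeHead(502);
    clientRes.end();
  });

  // 3. End the request, as the Node docs require for http.request(), even
  //    when there is no body to send.
  if (clientReq.method === 'GET' || clientReq.method === 'HEAD') {
    upstream.end();
  } else {
    clientReq.pipe(upstream); // pipe() ends the upstream request after the body
  }
});

// 1. Error handling for the HTTP server: client-side socket errors are
//    logged or dropped instead of crashing the process.
server.on('error', (err) => console.error('server error:', err.message));
server.on('clientError', (err, socket) => socket.destroy());

server.listen(8080);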

Edit:
I still see errors, but they don't crash James anymore.
I was able to generate longer stack traces by using longjohn in hoxy (see the note after the trace):

ERROR: Error: read ETIMEDOUT
      at exports._errnoException (util.js:1050:11)
      at TLSWrap.onread (net.js:581:26)
  ---------------------------------------------
      at TLSSocket.Readable.on (_stream_readable.js:689:35)
      at tickOnSocket (_http_client.js:608:10)
      at onSocketNT (_http_client.js:638:5)
      at _combinedTickCallback (internal/process/next_tick.js:80:11)
      at process._tickDomainCallback (internal/process/next_tick.js:128:9)
  ---------------------------------------------
      at ClientRequest.onSocket (_http_client.js:626:11)
      at _http_agent.js:164:11
      at oncreate (_http_agent.js:210:5)
      at Agent.createSocket (_http_agent.js:197:5)
      at Agent.addRequest (_http_agent.js:157:10)
      at new ClientRequest (_http_client.js:212:16)
      at Object.request (http.js:26:10)
      at Object.request (https.js:206:15)
      at new ProvisionableRequest (PATH/hoxy/lib/cycle.js:141:24)
      at Cycle.callee$2$0$ (PATH/hoxy/lib/cycle.js:420:34)
      at tryCatch (PATH/hoxy/node_modules/babel-runtime/regenerator/runtime.js:72:40)
      at Generator.invoke [as _invoke] (PATH/hoxy/node_modules/babel-runtime/regenerator/runtime.js:334:22)
      at Generator.prototype.(anonymous function) [as next] (PATH/hoxy/node_modules/babel-runtime/regenerator/runtime.js:105:21)
      at onFulfilled (PATH/hoxy/node_modules/co/index.js:65:19)
      at PATH/hoxy/node_modules/co/index.js:54:5
      at Promise (<anonymous>)
      at Cycle.co (PATH/hoxy/node_modules/co/index.js:50:10)
      at Cycle._sendToServer (PATH/hoxy/lib/cycle.js:415:30)
      at Proxy.callee$3$0$ (PATH/hoxy/lib/proxy.js:271:28)
      at tryCatch (PATH/hoxy/node_modules/babel-runtime/regenerator/runtime.js:72:40)
      at Generator.invoke [as _invoke] (PATH/hoxy/node_modules/babel-runtime/regenerator/runtime.js:334:22)
      at Generator.prototype.(anonymous function) [as next] (PATH/hoxy/node_modules/babel-runtime/regenerator/runtime.js:105:21)
      at onFulfilled (PATH/hoxy/node_modules/co/index.js:65:19)
      at <anonymous>
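
For reference, enabling longjohn usually just means requiring it once at the process entry point; the snippet below shows that usual pattern and is only an assumption about how it was wired into hoxy, not the actual change:

// Assumed wiring: requiring longjohn once at the entry point is enough to get
// the extended async stack traces; async_trace_limit caps how many async hops
// are kept (-1 means unlimited). It is slow, so keep it out of production.
if (process.env.NODE_ENV !== 'production') {
  const longjohn = require('longjohn');
  longjohn.async_trace_limit = 10; // assumption: 10 hops is plenty here
}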

@gartmeier
Author

I'd like to test it, but I have no idea how to build James with the updated hoxy library. @nerdbeere is it possible to provide a macOS build?

@nerdbeere
Member


@torfeld6 Here's a short how-to:

Assuming that you also cloned the james repo and start it with npm run start:

  • Go to james project root
  • npm link hoxy
  • npm start

Explanation:
npm link allows you to link dependencies locally, so you can work on a dependency without having to publish every change to npm.

@nerdbeere
Member

@mitchhentges assuming my fixes work, what do you think would be the best course of action? Should we try to get the fixes into hoxy or start using our own fork?
We still have open PRs there from 2016.

@mitchhentges
Member

@nerdbeere you're on fire 💯
I think that we should create a PR into hoxy "upstream" (which will hopefully eventually be merged), but we should also publish our own fork - especially if the hoxy PR hasn't been merged by our next release, which should hopefully be soon, considering some of the potential improvements in James 😉
