GRPC call failure when multiple nodes connecting to 1 seed node. #369

Closed
tegefaulkes opened this issue Apr 1, 2022 · 41 comments · Fixed by #370
Assignees
Labels
bug (Something isn't working) · development (Standard development) · r&d:polykey:core activity 3 (Peer to Peer Federated Hierarchy)

Comments

@tegefaulkes
Contributor

tegefaulkes commented Apr 1, 2022

Specification

There seems to be a problem with the GRPC client calls in a very specific circumstance. In the test tests/nat/noNAT.test.ts:239 'agents in different namespaces can ping each other via seed node' we are trying to connect to the same seed node from two agents. The sequence is:

  1. seed1 starts
  2. seed2 starts
  3. agent1 starts
  4. agent2 fails to start.

The problem occurs during the polykey.start() sequence in agent2, when we call nodeConnectionManager.syncNodeGraph(). This makes an agent-to-agent GRPC call. It seems that any client GRPC call to seed1 causes the error ErrorGRPCClientCall: Generic call error - 13 INTERNAL: Received RST_STREAM with code 2 triggered by internal client error: Protocol error to be thrown.

This only happens within this test. Trying the same procedure manually causes no problem, and replicating the circumstances using Polykey agents and NodeConnectionManagers to make the GRPC calls works fine as well.

The error implies that the stream is being closed because of a protocol error during communication. It is hard to tell where this is coming from.

We need to work out why this problem is happening.

Some things to verify:

  1. When we see WARN:ConnectionForward 127.0.0.1:53659:Client Error: Error: write EPIPE, is it because the clientSocket is closed while the tlsSocket is still trying to write data to it, thus emitting an EPIPE? (See the sketch after this list.)
  2. Whether GRPCClient is triggering the clientSocket to be closed on the ConnectionForward.
  3. Why the GRPCClient is closing, and what is causing the subsequent protocol error ErrorGRPCClientCall: Generic call error - 13 INTERNAL: Received RST_STREAM with code 2 triggered by internal client error: Protocol error.
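
For item 1, a minimal standalone sketch (plain Node.js net sockets, not the Polykey ConnectionForward code) of how a write EPIPE arises when one side keeps writing into a socket whose peer has already destroyed its end; depending on timing and platform the error may surface as ECONNRESET instead:

```ts
import * as net from 'net';

// Sketch: the server side (standing in for the closed clientSocket) destroys
// the connection immediately; the connecting side (standing in for the
// tlsSocket that keeps forwarding data) then writes into the dead socket.
const server = net.createServer((conn) => {
  conn.destroy();
});

server.listen(0, '127.0.0.1', () => {
  const { port } = server.address() as net.AddressInfo;
  const socket = net.connect(port, '127.0.0.1', () => {
    // The first write provokes an RST from the destroyed peer; a later write
    // then fails locally with EPIPE (or ECONNRESET, depending on timing).
    socket.write(Buffer.alloc(64 * 1024));
    setTimeout(() => socket.write('more data'), 100);
  });
  socket.on('error', (e) => {
    console.error('client-side error:', e.message);
    server.close();
  });
});
```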

Additional context

Tasks

  1. Explore and discuss the problem further.
  2. Fix up logger naming and handle connections to 0.0.0.0 to prevent accidental usage: GRPC call failure when multiple nodes connecting to 1 seed node. #369 (comment)
  3. Refactor how createConnection works in NCM so it's not outputting that it's creating a connection multiple times, even though it fetches it from the cache GRPC call failure when multiple nodes connecting to 1 seed node. #369 (comment)
  4. Change the withConnF error handling to use the ResourceRelease and pass the error in there, and reduce all the relevant error instances to a limited set of exceptions, preferably only those that relate to connection failures. Consider how this may be delegated by connectivity state changes though (let's avoid duplicate logic between connectivity state and exceptions there). GRPC call failure when multiple nodes connecting to 1 seed node. #369 (comment)
@tegefaulkes tegefaulkes added bug Something isn't working development Standard development labels Apr 1, 2022
@tegefaulkes tegefaulkes self-assigned this Apr 1, 2022
@CMCDragonkai
Member

The ErrorGRPCClientCall is only constructed by grpcUtils.toError, which converts a ServiceError into an exception object. The code must be INTERNAL, which means, based on:

https://grpc.github.io/grpc/core/md_doc_statuscodes.html

It is one of:

| Case | Status code | Generated at |
| --- | --- | --- |
| Could not decompress, but compression algorithm supported (Server -> Client) | INTERNAL | Client |
| Flow-control protocol violation | INTERNAL | Both |
| Error parsing response proto | INTERNAL | Client |
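
For reference, the 13 in the error message is the numeric value of grpc.status.INTERNAL in @grpc/grpc-js, so a ServiceError can be checked against it directly. A minimal sketch of that check (not the actual grpcUtils.toError implementation):

```ts
import { status } from '@grpc/grpc-js';
import type { ServiceError } from '@grpc/grpc-js';

// Sketch: classify a grpc-js ServiceError before wrapping it into a domain
// exception such as ErrorGRPCClientCall.
function describeServiceError(e: ServiceError): string {
  if (e.code === status.INTERNAL) {
    // 13 INTERNAL: decompression failure, flow-control violation,
    // or response proto parse error (see the table above)
    return `INTERNAL (${e.code}): ${e.details}`;
  }
  return `${status[e.code]} (${e.code}): ${e.details}`;
}
```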

@CMCDragonkai
Member

CMCDragonkai commented Apr 4, 2022

It's important to set up Wireshark to decode GRPC frames. We have plaintext GRPC frames being transferred between the GRPC client and the forward side of the proxy, as well as between the reverse side of the proxy and the GRPC server. The Wireshark capture should be watching those packets, which together form an HTTP2 frame and GRPC frame trace. https://wiki.wireshark.org/gRPC

We've done this before using Wireshark alone, no additional software; it's just some settings in Wireshark. And I remember there being an issue for this. It's here! https://gitlab.com/MatrixAI/Employees/matrix-orientation/-/blob/master/Development/Debugging%20gRPC%20(Wireshark).md

The point of doing this is to find out what the exact sequence of events is, and furthermore whether the RST_STREAM is sent from client to server or from server to client. We should also know whether the server side is emitting an error here.

@CMCDragonkai
Member

Additionally, the exceptions that we saw in our logs are not in the same order as the events actually occurred:

The error seen by the second regular node is WARN:ConnectionForward 127.0.0.1:53659:Client Error: Error: write EPIPE and then later ErrorGRPCClientCall: Generic call error - 13 INTERNAL: Received RST_STREAM with code 2 triggered by internal client error: Protocol error.

This is the output from when the second agent first starts attempting to connect to the seed node until the final error (where the test fails).

INFO:agent2:INFO:ConnectionForward 0.0.0.0:48804:Started Connection Forward
INFO:agent2:INFO:ConnectionForward 0.0.0.0:48804:Composing Connection Forward
INFO:agent2:INFO:ConnectionForward 0.0.0.0:48804:Composed Connection Forward
INFO:agent2:INFO:Proxy:Handled CONNECT to 0.0.0.0:48804
INFO:agent2:INFO:clientFactory:Created GRPCClientAgent connecting to 0.0.0.0:48804
INFO:agent2:INFO:0.0.0.0:48804:Created NodeConnection
INFO:agent2:INFO:NodeConnectionManager:Creating connection to vkdenb2lkvha6u4g255mgqmcuha3dllvfe8buaq10d3du4cffkrug
INFO:agent2:INFO:0.0.0.0:48804:Destroying NodeConnection
INFO:agent2:INFO:clientFactory:Destroying GRPCClientAgent connected to 0.0.0.0:48804
INFO:agent2:INFO:clientFactory:Destroyed GRPCClientAgent connected to 0.0.0.0:48804
INFO:agent2:INFO:0.0.0.0:48804:Destroyed NodeConnection
INFO:agent2:WARN:PolykeyAgent:Failed Starting PolykeyAgent
INFO:agent2:INFO:Status:Begin Status STOPPING
INFO:agent2:INFO:Status:Writing Status to /tmp/polykey-test-Cbs3Sk/agent2/status.json
INFO:agent2:WARN:ConnectionForward 0.0.0.0:48804:Client Error: Error: write EPIPE
INFO:agent2:INFO:ConnectionForward 0.0.0.0:48804:Stopping Connection Forward
INFO:agent2:INFO:Status:Status is STOPPING
INFO:agent2:INFO:SessionManager:Stopping SessionManager
INFO:agent2:INFO:SessionManager:Stopped SessionManager
INFO:agent2:INFO:NotificationsManager:Stopping NotificationsManager
INFO:agent2:INFO:NotificationsManager:Stopped NotificationsManager
INFO:agent2:INFO:VaultManager:Stopping VaultManager
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:33415:Stopping Connection Reverse
INFO:agent2:INFO:EncryptedFileSystem:Stopping EncryptedFS
INFO:agent2:INFO:INodeManager:Stopping INodeManager
INFO:agent2:INFO:INodeManager:Stopped INodeManager
INFO:agent2:INFO:DB:Stopping DB
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:33415:Stopped Connection Reverse
INFO:agent2:INFO:DB:Stopped DB
INFO:agent2:INFO:EncryptedFileSystem:Stopped EncryptedFS
INFO:agent2:INFO:VaultManager:Stopped VaultManager
INFO:agent2:INFO:Discovery:Stopping Discovery
INFO:agent2:INFO:Discovery:Stopped Discovery
INFO:agent2:INFO:Proxy:Stopping Proxy Server
INFO:agent2:WARN:ConnectionForward 0.0.0.0:48804:Forward Error: ErrorConnectionEndTimeout
INFO:agent2:INFO:ConnectionForward 0.0.0.0:48804:Stopped Connection Forward
INFO:agent2:INFO:Proxy:Stopped Proxy Server
INFO:agent2:INFO:GRPCServerAgent:Stopping GRPCServer
INFO:agent2:INFO:GRPCServerAgent:Stopped GRPCServer
INFO:agent2:INFO:GRPCServerClient:Stopping GRPCServer
INFO:agent2:INFO:GRPCServerClient:Stopped GRPCServer
INFO:agent2:INFO:GestaltGraph:Stopping GestaltGraph
INFO:agent2:INFO:GestaltGraph:Stopped GestaltGraph
INFO:agent2:INFO:ACL:Stopping ACL
INFO:agent2:INFO:ACL:Stopped ACL
INFO:agent2:INFO:Sigchain:Stopping Sigchain
INFO:agent2:INFO:Sigchain:Stopped Sigchain
INFO:agent2:INFO:IdentitiesManager:Stopping IdentitiesManager
INFO:agent2:INFO:IdentitiesManager:Stopped IdentitiesManager
INFO:agent2:INFO:DB:Stopping DB
INFO:agent2:INFO:DB:Stopped DB
INFO:agent2:INFO:KeyManager:Stopping KeyManager
INFO:agent2:INFO:KeyManager:Stopped KeyManager
INFO:agent2:INFO:Schema:Stopping Schema
INFO:agent2:INFO:Schema:Stopped Schema
INFO:agent2:INFO:Status:Stopping Status
INFO:agent2:INFO:Status:Writing Status to /tmp/polykey-test-Cbs3Sk/agent2/status.json
INFO:agent2:INFO:Status:Status is DEAD
INFO:agent2:ErrorGRPCClientCall: Generic call error - 13 INTERNAL: Received RST_STREAM with code 2 triggered by internal client error: Protocol error

This means we need to produce a proper trace, because the RST_STREAM exception above may actually have been caused after another error that occurred somewhat lower in the protocol stack.

@CMCDragonkai
Member

CMCDragonkai commented Apr 4, 2022

I noticed the following:

INFO:agent2:INFO:0.0.0.0:48804:Created NodeConnection
INFO:agent2:WARN:ConnectionForward 0.0.0.0:48804:Client Error: Error: write EPIPE

The 0.0.0.0 shouldn't be there. That implies it is attempting a connection to 0.0.0.0, which is a wildcard address that cannot be connected to.

This should have been caught by handleConnectForward to prevent attempts to connect to 0.0.0.0.

There's a test in Proxy.test.ts, test('connection to port 0 fails', async () => {. It should be expanded, or an additional test added for connection to 0.0.0.0 failing. The same should be done for the IPv6 equivalent, which is [::].

However this problem still occurs even after fixing up the 0.0.0.0.

Additionally, the logger output is missing the NodeConnection class name.

@tegefaulkes

  1. Add a check in handleConnectForward for the wildcard addresses, since they should not be connectable. This could be done in Proxy.establishConnectionForward, because it is used by both handleConnectForward and openConnectionForward, and it would throw a specific exception for this. Have a look at src/network/errors.ts to see what is acceptable. (See the sketch after this list.)
  2. Add additional tests for this in Proxy.test.ts.
  3. Ensure that NodeConnection.name is embedded in the NodeConnection logger.
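
For task 1, a rough sketch of what such a guard might look like; the helper name and the generic Error are placeholders, and the real implementation would throw a specific exception from src/network/errors.ts inside Proxy.establishConnectionForward:

```ts
// Sketch only: reject wildcard addresses before any connection work is done.
const wildcardHosts = new Set(['0.0.0.0', '::', '[::]']);

function checkConnectableHost(host: string, port: number): void {
  if (port === 0 || wildcardHosts.has(host)) {
    // In the real code this would be a specific network exception,
    // e.g. one defined in src/network/errors.ts
    throw new Error(
      `Cannot connect to ${host}:${port}; wildcard addresses and port 0 are not connectable`,
    );
  }
}

// Usage (hypothetical) at the top of Proxy.establishConnectionForward,
// which is shared by handleConnectForward and openConnectionForward:
// checkConnectableHost(host, port);
```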

@emmacasolin
Contributor

emmacasolin commented Apr 4, 2022

Logging using 127.0.0.1:

INFO:seed1:INFO:PolykeyAgent:Creating PolykeyAgent
INFO:seed1:INFO:PolykeyAgent:Setting umask to 077
INFO:seed1:INFO:PolykeyAgent:Setting node path to /tmp/polykey-test-eSpCi5/seed1
INFO:seed1:INFO:Status:Starting Status
INFO:seed1:INFO:Status:Writing Status to /tmp/polykey-test-eSpCi5/seed1/status.json
INFO:seed1:INFO:Status:Status is STARTING
INFO:seed1:INFO:Schema:Creating Schema
INFO:seed1:INFO:Schema:Starting Schema
INFO:seed1:INFO:Schema:Setting state path to /tmp/polykey-test-eSpCi5/seed1/state
INFO:seed1:INFO:Schema:Started Schema
INFO:seed1:INFO:Schema:Created Schema
INFO:seed1:INFO:KeyManager:Creating KeyManager
INFO:seed1:INFO:KeyManager:Setting keys path to /tmp/polykey-test-eSpCi5/seed1/state/keys
INFO:seed1:INFO:KeyManager:Starting KeyManager
INFO:seed1:INFO:KeyManager:Checking /tmp/polykey-test-eSpCi5/seed1/state/keys/root.pub and /tmp/polykey-test-eSpCi5/seed1/state/keys/root.key
INFO:seed1:INFO:KeyManager:Generating root key pair
INFO:seed1:INFO:KeyManager:Writing /tmp/polykey-test-eSpCi5/seed1/state/keys/root.pub and /tmp/polykey-test-eSpCi5/seed1/state/keys/root.key
INFO:seed1:INFO:KeyManager:Checking /tmp/polykey-test-eSpCi5/seed1/state/keys/root.crt
INFO:seed1:INFO:KeyManager:Generating root certificate
INFO:seed1:INFO:KeyManager:Writing /tmp/polykey-test-eSpCi5/seed1/state/keys/root.crt
INFO:seed1:INFO:KeyManager:Checking /tmp/polykey-test-eSpCi5/seed1/state/keys/db.key
INFO:seed1:INFO:KeyManager:Generating keys db key
INFO:seed1:INFO:KeyManager:Writing /tmp/polykey-test-eSpCi5/seed1/state/keys/db.key
INFO:seed1:INFO:KeyManager:Started KeyManager
INFO:seed1:INFO:KeyManager:Created KeyManager
INFO:seed1:INFO:DB:Creating DB
INFO:seed1:INFO:DB:Starting DB
INFO:seed1:INFO:DB:Setting DB path to /tmp/polykey-test-eSpCi5/seed1/state/db
INFO:seed1:INFO:DB:Started DB
INFO:seed1:INFO:DB:Created DB
INFO:seed1:INFO:IdentitiesManager:Creating IdentitiesManager
INFO:seed1:INFO:IdentitiesManager:Starting IdentitiesManager
INFO:seed1:INFO:IdentitiesManager:Started IdentitiesManager
INFO:seed1:INFO:IdentitiesManager:Created IdentitiesManager
INFO:seed1:INFO:Sigchain:Creating Sigchain
INFO:seed1:INFO:Sigchain:Starting Sigchain
INFO:seed1:INFO:Sigchain:Started Sigchain
INFO:seed1:INFO:Sigchain:Created Sigchain
INFO:seed1:INFO:ACL:Creating ACL
INFO:seed1:INFO:ACL:Starting ACL
INFO:seed1:INFO:ACL:Started ACL
INFO:seed1:INFO:ACL:Created ACL
INFO:seed1:INFO:GestaltGraph:Creating GestaltGraph
INFO:seed1:INFO:GestaltGraph:Starting GestaltGraph
INFO:seed1:INFO:GestaltGraph:Started GestaltGraph
INFO:seed1:INFO:GestaltGraph:Created GestaltGraph
INFO:seed1:INFO:Proxy:Creating Proxy
INFO:seed1:INFO:Proxy:Created Proxy
INFO:seed1:INFO:NodeGraph:Creating NodeGraph
INFO:seed1:INFO:NodeGraph:Starting NodeGraph
INFO:seed1:INFO:NodeGraph:Started NodeGraph
INFO:seed1:INFO:NodeGraph:Created NodeGraph
INFO:seed1:INFO:Discovery:Creating Discovery
INFO:seed1:INFO:Discovery:Starting Discovery
INFO:seed1:INFO:Discovery:Started Discovery
INFO:seed1:INFO:Discovery:Created Discovery
INFO:seed1:INFO:NotificationsManager:Creating NotificationsManager
INFO:seed1:INFO:NotificationsManager:Starting NotificationsManager
INFO:seed1:INFO:NotificationsManager:Started NotificationsManager
INFO:seed1:INFO:NotificationsManager:Created NotificationsManager
INFO:seed1:INFO:VaultManager:Creating VaultManager
INFO:seed1:INFO:VaultManager:Setting vaults path to /tmp/polykey-test-eSpCi5/seed1/state/vaults
INFO:seed1:INFO:VaultManager:Starting VaultManager
INFO:seed1:INFO:VaultManager:Generating vaults key
INFO:seed1:INFO:DB:Creating DB
INFO:seed1:INFO:DB:Starting DB
INFO:seed1:INFO:DB:Setting DB path to /tmp/polykey-test-eSpCi5/seed1/state/vaults/efs
INFO:seed1:INFO:DB:Started DB
INFO:seed1:INFO:DB:Created DB
INFO:seed1:INFO:INodeManager:Creating INodeManager
INFO:seed1:INFO:INodeManager:Starting INodeManager
INFO:seed1:INFO:INodeManager:Started INodeManager
INFO:seed1:INFO:INodeManager:Created INodeManager
INFO:seed1:INFO:EncryptedFileSystem:Starting EncryptedFS
INFO:seed1:INFO:EncryptedFileSystem:Started EncryptedFS
INFO:seed1:INFO:VaultManager:Started VaultManager
INFO:seed1:INFO:VaultManager:Created VaultManager
INFO:seed1:INFO:SessionManager:Creating SessionManager
INFO:seed1:INFO:SessionManager:Starting SessionManager
INFO:seed1:INFO:SessionManager:Generating sessions key
INFO:seed1:INFO:SessionManager:Started SessionManager
INFO:seed1:INFO:SessionManager:Created SessionManager
INFO:seed1:INFO:PolykeyAgent:Starting PolykeyAgent
INFO:seed1:INFO:GRPCServerClient:Starting GRPCServer on 127.0.0.1:0
INFO:seed1:INFO:GRPCServerClient:Started GRPCServer on 127.0.0.1:37921
INFO:seed1:INFO:GRPCServerAgent:Starting GRPCServer on 127.0.0.1:0
INFO:seed1:INFO:GRPCServerAgent:Started GRPCServer on 127.0.0.1:38645
INFO:seed1:INFO:Proxy:Starting Forward Proxy from 127.0.0.1:0 to 127.0.0.1:0 and Reverse Proxy from 127.0.0.1:0 to 127.0.0.1:38645
INFO:seed1:INFO:Proxy:Started Forward Proxy from 127.0.0.1:46839 to 127.0.0.1:56320 and Reverse Proxy from 127.0.0.1:56320 to 127.0.0.1:38645
INFO:seed1:INFO:NodeConnectionManager:Starting NodeConnectionManager
INFO:seed1:INFO:NodeConnectionManager:Started NodeConnectionManager
INFO:seed1:INFO:Status:Finish Status STARTING
INFO:seed1:INFO:Status:Writing Status to /tmp/polykey-test-eSpCi5/seed1/status.json
INFO:seed1:INFO:Status:Status is LIVE
INFO:seed1:INFO:PolykeyAgent:Started PolykeyAgent
INFO:seed1:INFO:PolykeyAgent:Created PolykeyAgent
INFO:agent1:INFO:PolykeyAgent:Creating PolykeyAgent
INFO:agent1:INFO:PolykeyAgent:Setting umask to 077
INFO:agent1:INFO:PolykeyAgent:Setting node path to /tmp/polykey-test-eSpCi5/agent1
INFO:agent1:INFO:Status:Starting Status
INFO:agent1:INFO:Status:Writing Status to /tmp/polykey-test-eSpCi5/agent1/status.json
INFO:agent1:INFO:Status:Status is STARTING
INFO:agent1:INFO:Schema:Creating Schema
INFO:agent1:INFO:Schema:Starting Schema
INFO:agent1:INFO:Schema:Setting state path to /tmp/polykey-test-eSpCi5/agent1/state
INFO:agent1:INFO:Schema:Started Schema
INFO:agent1:INFO:Schema:Created Schema
INFO:agent1:INFO:KeyManager:Creating KeyManager
INFO:agent1:INFO:KeyManager:Setting keys path to /tmp/polykey-test-eSpCi5/agent1/state/keys
INFO:agent1:INFO:KeyManager:Starting KeyManager
INFO:agent1:INFO:KeyManager:Checking /tmp/polykey-test-eSpCi5/agent1/state/keys/root.pub and /tmp/polykey-test-eSpCi5/agent1/state/keys/root.key
INFO:agent1:INFO:KeyManager:Generating root key pair
INFO:agent1:INFO:KeyManager:Writing /tmp/polykey-test-eSpCi5/agent1/state/keys/root.pub and /tmp/polykey-test-eSpCi5/agent1/state/keys/root.key
INFO:agent1:INFO:KeyManager:Checking /tmp/polykey-test-eSpCi5/agent1/state/keys/root.crt
INFO:agent1:INFO:KeyManager:Generating root certificate
INFO:agent1:INFO:KeyManager:Writing /tmp/polykey-test-eSpCi5/agent1/state/keys/root.crt
INFO:agent1:INFO:KeyManager:Checking /tmp/polykey-test-eSpCi5/agent1/state/keys/db.key
INFO:agent1:INFO:KeyManager:Generating keys db key
INFO:agent1:INFO:KeyManager:Writing /tmp/polykey-test-eSpCi5/agent1/state/keys/db.key
INFO:agent1:INFO:KeyManager:Started KeyManager
INFO:agent1:INFO:KeyManager:Created KeyManager
INFO:agent1:INFO:DB:Creating DB
INFO:agent1:INFO:DB:Starting DB
INFO:agent1:INFO:DB:Setting DB path to /tmp/polykey-test-eSpCi5/agent1/state/db
INFO:agent1:INFO:DB:Started DB
INFO:agent1:INFO:DB:Created DB
INFO:agent1:INFO:IdentitiesManager:Creating IdentitiesManager
INFO:agent1:INFO:IdentitiesManager:Starting IdentitiesManager
INFO:agent1:INFO:IdentitiesManager:Started IdentitiesManager
INFO:agent1:INFO:IdentitiesManager:Created IdentitiesManager
INFO:agent1:INFO:Sigchain:Creating Sigchain
INFO:agent1:INFO:Sigchain:Starting Sigchain
INFO:agent1:INFO:Sigchain:Started Sigchain
INFO:agent1:INFO:Sigchain:Created Sigchain
INFO:agent1:INFO:ACL:Creating ACL
INFO:agent1:INFO:ACL:Starting ACL
INFO:agent1:INFO:ACL:Started ACL
INFO:agent1:INFO:ACL:Created ACL
INFO:agent1:INFO:GestaltGraph:Creating GestaltGraph
INFO:agent1:INFO:GestaltGraph:Starting GestaltGraph
INFO:agent1:INFO:GestaltGraph:Started GestaltGraph
INFO:agent1:INFO:GestaltGraph:Created GestaltGraph
INFO:agent1:INFO:Proxy:Creating Proxy
INFO:agent1:INFO:Proxy:Created Proxy
INFO:agent1:INFO:NodeGraph:Creating NodeGraph
INFO:agent1:INFO:NodeGraph:Starting NodeGraph
INFO:agent1:INFO:NodeGraph:Started NodeGraph
INFO:agent1:INFO:NodeGraph:Created NodeGraph
INFO:agent1:INFO:Discovery:Creating Discovery
INFO:agent1:INFO:Discovery:Starting Discovery
INFO:agent1:INFO:Discovery:Started Discovery
INFO:agent1:INFO:Discovery:Created Discovery
INFO:agent1:INFO:NotificationsManager:Creating NotificationsManager
INFO:agent1:INFO:NotificationsManager:Starting NotificationsManager
INFO:agent1:INFO:NotificationsManager:Started NotificationsManager
INFO:agent1:INFO:NotificationsManager:Created NotificationsManager
INFO:agent1:INFO:VaultManager:Creating VaultManager
INFO:agent1:INFO:VaultManager:Setting vaults path to /tmp/polykey-test-eSpCi5/agent1/state/vaults
INFO:agent1:INFO:VaultManager:Starting VaultManager
INFO:agent1:INFO:VaultManager:Generating vaults key
INFO:agent1:INFO:DB:Creating DB
INFO:agent1:INFO:DB:Starting DB
INFO:agent1:INFO:DB:Setting DB path to /tmp/polykey-test-eSpCi5/agent1/state/vaults/efs
INFO:agent1:INFO:DB:Started DB
INFO:agent1:INFO:DB:Created DB
INFO:agent1:INFO:INodeManager:Creating INodeManager
INFO:agent1:INFO:INodeManager:Starting INodeManager
INFO:agent1:INFO:INodeManager:Started INodeManager
INFO:agent1:INFO:INodeManager:Created INodeManager
INFO:agent1:INFO:EncryptedFileSystem:Starting EncryptedFS
INFO:agent1:INFO:EncryptedFileSystem:Started EncryptedFS
INFO:agent1:INFO:VaultManager:Started VaultManager
INFO:agent1:INFO:VaultManager:Created VaultManager
INFO:agent1:INFO:SessionManager:Creating SessionManager
INFO:agent1:INFO:SessionManager:Starting SessionManager
INFO:agent1:INFO:SessionManager:Generating sessions key
INFO:agent1:INFO:SessionManager:Started SessionManager
INFO:agent1:INFO:SessionManager:Created SessionManager
INFO:agent1:INFO:PolykeyAgent:Starting PolykeyAgent
INFO:agent1:INFO:GRPCServerClient:Starting GRPCServer on 127.0.0.1:0
INFO:agent1:INFO:GRPCServerClient:Started GRPCServer on 127.0.0.1:37073
INFO:agent1:INFO:GRPCServerAgent:Starting GRPCServer on 127.0.0.1:0
INFO:agent1:INFO:GRPCServerAgent:Started GRPCServer on 127.0.0.1:42647
INFO:agent1:INFO:Proxy:Starting Forward Proxy from 127.0.0.1:0 to 127.0.0.1:0 and Reverse Proxy from 127.0.0.1:0 to 127.0.0.1:42647
INFO:agent1:INFO:Proxy:Started Forward Proxy from 127.0.0.1:38279 to 127.0.0.1:47589 and Reverse Proxy from 127.0.0.1:47589 to 127.0.0.1:42647
INFO:agent1:INFO:NodeConnectionManager:Starting NodeConnectionManager
INFO:agent1:INFO:NodeConnectionManager:Started NodeConnectionManager
INFO:agent1:INFO:NodeConnectionManager:Creating connection to vebtn06gujdclgt4ep6810u8n99ecg6rq3tkgukiq2d4s6uon7nh0
INFO:agent1:INFO:NodeConnectionManager:no existing entry: creating connection to vebtn06gujdclgt4ep6810u8n99ecg6rq3tkgukiq2d4s6uon7nh0
INFO:agent1:INFO:127.0.0.1:56320:Creating NodeConnection
INFO:agent1:INFO:clientFactory:Creating GRPCClientAgent connecting to 127.0.0.1:56320
INFO:agent1:INFO:Proxy:Handling CONNECT to 127.0.0.1:56320
INFO:agent1:INFO:ConnectionForward 127.0.0.1:56320:Starting Connection Forward
INFO:seed1:INFO:Proxy:Handling connection from 127.0.0.1:47589
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:47589:Starting Connection Reverse
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:47589:Started Connection Reverse
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:47589:Composing Connection Reverse
INFO:agent1:INFO:ConnectionForward 127.0.0.1:56320:Started Connection Forward
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:47589:Composed Connection Reverse
INFO:seed1:INFO:Proxy:Handled connection from 127.0.0.1:47589
INFO:agent1:INFO:ConnectionForward 127.0.0.1:56320:Composing Connection Forward
INFO:agent1:INFO:ConnectionForward 127.0.0.1:56320:Composed Connection Forward
INFO:agent1:INFO:Proxy:Handled CONNECT to 127.0.0.1:56320
INFO:agent1:INFO:clientFactory:Created GRPCClientAgent connecting to 127.0.0.1:56320
INFO:agent1:INFO:127.0.0.1:56320:Created NodeConnection
INFO:agent1:INFO:NodeConnectionManager:Creating connection to vebtn06gujdclgt4ep6810u8n99ecg6rq3tkgukiq2d4s6uon7nh0
INFO:agent1:INFO:Status:Finish Status STARTING
INFO:agent1:INFO:Status:Writing Status to /tmp/polykey-test-eSpCi5/agent1/status.json
INFO:agent1:INFO:Status:Status is LIVE
INFO:agent1:INFO:PolykeyAgent:Started PolykeyAgent
INFO:agent1:INFO:PolykeyAgent:Created PolykeyAgent
INFO:agent2:INFO:PolykeyAgent:Creating PolykeyAgent
INFO:agent2:INFO:PolykeyAgent:Setting umask to 077
INFO:agent2:INFO:PolykeyAgent:Setting node path to /tmp/polykey-test-eSpCi5/agent2
INFO:agent2:INFO:Status:Starting Status
INFO:agent2:INFO:Status:Writing Status to /tmp/polykey-test-eSpCi5/agent2/status.json
INFO:agent2:INFO:Status:Status is STARTING
INFO:agent2:INFO:Schema:Creating Schema
INFO:agent2:INFO:Schema:Starting Schema
INFO:agent2:INFO:Schema:Setting state path to /tmp/polykey-test-eSpCi5/agent2/state
INFO:agent2:INFO:Schema:Started Schema
INFO:agent2:INFO:Schema:Created Schema
INFO:agent2:INFO:KeyManager:Creating KeyManager
INFO:agent2:INFO:KeyManager:Setting keys path to /tmp/polykey-test-eSpCi5/agent2/state/keys
INFO:agent2:INFO:KeyManager:Starting KeyManager
INFO:agent2:INFO:KeyManager:Checking /tmp/polykey-test-eSpCi5/agent2/state/keys/root.pub and /tmp/polykey-test-eSpCi5/agent2/state/keys/root.key
INFO:agent2:INFO:KeyManager:Generating root key pair
INFO:agent2:INFO:KeyManager:Writing /tmp/polykey-test-eSpCi5/agent2/state/keys/root.pub and /tmp/polykey-test-eSpCi5/agent2/state/keys/root.key
INFO:agent2:INFO:KeyManager:Checking /tmp/polykey-test-eSpCi5/agent2/state/keys/root.crt
INFO:agent2:INFO:KeyManager:Generating root certificate
INFO:agent2:INFO:KeyManager:Writing /tmp/polykey-test-eSpCi5/agent2/state/keys/root.crt
INFO:agent2:INFO:KeyManager:Checking /tmp/polykey-test-eSpCi5/agent2/state/keys/db.key
INFO:agent2:INFO:KeyManager:Generating keys db key
INFO:agent2:INFO:KeyManager:Writing /tmp/polykey-test-eSpCi5/agent2/state/keys/db.key
INFO:agent2:INFO:KeyManager:Started KeyManager
INFO:agent2:INFO:KeyManager:Created KeyManager
INFO:agent2:INFO:DB:Creating DB
INFO:agent2:INFO:DB:Starting DB
INFO:agent2:INFO:DB:Setting DB path to /tmp/polykey-test-eSpCi5/agent2/state/db
INFO:agent2:INFO:DB:Started DB
INFO:agent2:INFO:DB:Created DB
INFO:agent2:INFO:IdentitiesManager:Creating IdentitiesManager
INFO:agent2:INFO:IdentitiesManager:Starting IdentitiesManager
INFO:agent2:INFO:IdentitiesManager:Started IdentitiesManager
INFO:agent2:INFO:IdentitiesManager:Created IdentitiesManager
INFO:agent2:INFO:Sigchain:Creating Sigchain
INFO:agent2:INFO:Sigchain:Starting Sigchain
INFO:agent2:INFO:Sigchain:Started Sigchain
INFO:agent2:INFO:Sigchain:Created Sigchain
INFO:agent2:INFO:ACL:Creating ACL
INFO:agent2:INFO:ACL:Starting ACL
INFO:agent2:INFO:ACL:Started ACL
INFO:agent2:INFO:ACL:Created ACL
INFO:agent2:INFO:GestaltGraph:Creating GestaltGraph
INFO:agent2:INFO:GestaltGraph:Starting GestaltGraph
INFO:agent2:INFO:GestaltGraph:Started GestaltGraph
INFO:agent2:INFO:GestaltGraph:Created GestaltGraph
INFO:agent2:INFO:Proxy:Creating Proxy
INFO:agent2:INFO:Proxy:Created Proxy
INFO:agent2:INFO:NodeGraph:Creating NodeGraph
INFO:agent2:INFO:NodeGraph:Starting NodeGraph
INFO:agent2:INFO:NodeGraph:Started NodeGraph
INFO:agent2:INFO:NodeGraph:Created NodeGraph
INFO:agent2:INFO:Discovery:Creating Discovery
INFO:agent2:INFO:Discovery:Starting Discovery
INFO:agent2:INFO:Discovery:Started Discovery
INFO:agent2:INFO:Discovery:Created Discovery
INFO:agent2:INFO:NotificationsManager:Creating NotificationsManager
INFO:agent2:INFO:NotificationsManager:Starting NotificationsManager
INFO:agent2:INFO:NotificationsManager:Started NotificationsManager
INFO:agent2:INFO:NotificationsManager:Created NotificationsManager
INFO:agent2:INFO:VaultManager:Creating VaultManager
INFO:agent2:INFO:VaultManager:Setting vaults path to /tmp/polykey-test-eSpCi5/agent2/state/vaults
INFO:agent2:INFO:VaultManager:Starting VaultManager
INFO:agent2:INFO:VaultManager:Generating vaults key
INFO:agent2:INFO:DB:Creating DB
INFO:agent2:INFO:DB:Starting DB
INFO:agent2:INFO:DB:Setting DB path to /tmp/polykey-test-eSpCi5/agent2/state/vaults/efs
INFO:agent2:INFO:DB:Started DB
INFO:agent2:INFO:DB:Created DB
INFO:agent2:INFO:INodeManager:Creating INodeManager
INFO:agent2:INFO:INodeManager:Starting INodeManager
INFO:agent2:INFO:INodeManager:Started INodeManager
INFO:agent2:INFO:INodeManager:Created INodeManager
INFO:agent2:INFO:EncryptedFileSystem:Starting EncryptedFS
INFO:agent2:INFO:EncryptedFileSystem:Started EncryptedFS
INFO:agent2:INFO:VaultManager:Started VaultManager
INFO:agent2:INFO:VaultManager:Created VaultManager
INFO:agent2:INFO:SessionManager:Creating SessionManager
INFO:agent2:INFO:SessionManager:Starting SessionManager
INFO:agent2:INFO:SessionManager:Generating sessions key
INFO:agent2:INFO:SessionManager:Started SessionManager
INFO:agent2:INFO:SessionManager:Created SessionManager
INFO:agent2:INFO:PolykeyAgent:Starting PolykeyAgent
INFO:agent2:INFO:GRPCServerClient:Starting GRPCServer on 127.0.0.1:0
INFO:agent2:INFO:GRPCServerClient:Started GRPCServer on 127.0.0.1:33473
INFO:agent2:INFO:GRPCServerAgent:Starting GRPCServer on 127.0.0.1:0
INFO:agent2:INFO:GRPCServerAgent:Started GRPCServer on 127.0.0.1:33979
INFO:agent2:INFO:Proxy:Starting Forward Proxy from 127.0.0.1:0 to 127.0.0.1:0 and Reverse Proxy from 127.0.0.1:0 to 127.0.0.1:33979
INFO:agent2:INFO:Proxy:Started Forward Proxy from 127.0.0.1:43845 to 127.0.0.1:53009 and Reverse Proxy from 127.0.0.1:53009 to 127.0.0.1:33979
INFO:agent2:INFO:NodeConnectionManager:Starting NodeConnectionManager
INFO:agent2:INFO:NodeConnectionManager:Started NodeConnectionManager
INFO:agent2:INFO:NodeConnectionManager:Creating connection to vebtn06gujdclgt4ep6810u8n99ecg6rq3tkgukiq2d4s6uon7nh0
INFO:agent2:INFO:NodeConnectionManager:no existing entry: creating connection to vebtn06gujdclgt4ep6810u8n99ecg6rq3tkgukiq2d4s6uon7nh0
INFO:agent2:INFO:127.0.0.1:56320:Creating NodeConnection
INFO:agent2:INFO:clientFactory:Creating GRPCClientAgent connecting to 127.0.0.1:56320
INFO:agent2:INFO:Proxy:Handling CONNECT to 127.0.0.1:56320
INFO:agent2:INFO:ConnectionForward 127.0.0.1:56320:Starting Connection Forward
INFO:seed1:INFO:Proxy:Handling connection from 127.0.0.1:53009
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:53009:Starting Connection Reverse
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:53009:Started Connection Reverse
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:53009:Composing Connection Reverse
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:53009:Composed Connection Reverse
INFO:seed1:INFO:Proxy:Handled connection from 127.0.0.1:53009
INFO:agent2:INFO:ConnectionForward 127.0.0.1:56320:Started Connection Forward
INFO:agent2:INFO:ConnectionForward 127.0.0.1:56320:Composing Connection Forward
INFO:agent2:INFO:ConnectionForward 127.0.0.1:56320:Composed Connection Forward
INFO:agent2:INFO:Proxy:Handled CONNECT to 127.0.0.1:56320
INFO:agent2:INFO:clientFactory:Created GRPCClientAgent connecting to 127.0.0.1:56320
INFO:agent2:INFO:127.0.0.1:56320:Created NodeConnection
INFO:agent2:INFO:NodeConnectionManager:Creating connection to vebtn06gujdclgt4ep6810u8n99ecg6rq3tkgukiq2d4s6uon7nh0
INFO:agent2:INFO:127.0.0.1:56320:Destroying NodeConnection
INFO:agent2:INFO:clientFactory:Destroying GRPCClientAgent connected to 127.0.0.1:56320
INFO:agent2:INFO:clientFactory:Destroyed GRPCClientAgent connected to 127.0.0.1:56320
INFO:agent2:INFO:127.0.0.1:56320:Destroyed NodeConnection
INFO:agent2:WARN:PolykeyAgent:Failed Starting PolykeyAgent
INFO:agent2:INFO:Status:Begin Status STOPPING
INFO:agent2:INFO:Status:Writing Status to /tmp/polykey-test-eSpCi5/agent2/status.json
INFO:agent2:WARN:ConnectionForward 127.0.0.1:56320:Client Error: Error: write EPIPE
INFO:agent2:INFO:ConnectionForward 127.0.0.1:56320:Stopping Connection Forward
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:53009:Stopping Connection Reverse
INFO:agent2:INFO:Status:Status is STOPPING
INFO:agent2:INFO:SessionManager:Stopping SessionManager
INFO:agent2:INFO:SessionManager:Stopped SessionManager
INFO:agent2:INFO:NotificationsManager:Stopping NotificationsManager
INFO:agent2:INFO:NotificationsManager:Stopped NotificationsManager
INFO:agent2:INFO:VaultManager:Stopping VaultManager
INFO:agent2:INFO:EncryptedFileSystem:Stopping EncryptedFS
INFO:agent2:INFO:INodeManager:Stopping INodeManager
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:53009:Stopped Connection Reverse
INFO:agent2:INFO:INodeManager:Stopped INodeManager
INFO:agent2:INFO:DB:Stopping DB
INFO:agent2:INFO:DB:Stopped DB
INFO:agent2:INFO:EncryptedFileSystem:Stopped EncryptedFS
INFO:agent2:INFO:VaultManager:Stopped VaultManager
INFO:agent2:INFO:Discovery:Stopping Discovery
INFO:agent2:INFO:Discovery:Stopped Discovery
INFO:agent2:INFO:Proxy:Stopping Proxy Server
INFO:agent2:WARN:ConnectionForward 127.0.0.1:56320:Forward Error: ErrorConnectionEndTimeout
INFO:agent2:INFO:ConnectionForward 127.0.0.1:56320:Stopped Connection Forward
INFO:agent2:INFO:Proxy:Stopped Proxy Server
INFO:agent2:INFO:GRPCServerAgent:Stopping GRPCServer
INFO:agent2:INFO:GRPCServerAgent:Stopped GRPCServer
INFO:agent2:INFO:GRPCServerClient:Stopping GRPCServer
INFO:agent2:INFO:GRPCServerClient:Stopped GRPCServer
INFO:agent2:INFO:GestaltGraph:Stopping GestaltGraph
INFO:agent2:INFO:GestaltGraph:Stopped GestaltGraph
INFO:agent2:INFO:ACL:Stopping ACL
INFO:agent2:INFO:ACL:Stopped ACL
INFO:agent2:INFO:Sigchain:Stopping Sigchain
INFO:agent2:INFO:Sigchain:Stopped Sigchain
INFO:agent2:INFO:IdentitiesManager:Stopping IdentitiesManager
INFO:agent2:INFO:IdentitiesManager:Stopped IdentitiesManager
INFO:agent2:INFO:DB:Stopping DB
INFO:agent2:INFO:DB:Stopped DB
INFO:agent2:INFO:KeyManager:Stopping KeyManager
INFO:agent2:INFO:KeyManager:Stopped KeyManager
INFO:agent2:INFO:Schema:Stopping Schema
INFO:agent2:INFO:Schema:Stopped Schema
INFO:agent2:INFO:Status:Stopping Status
INFO:agent2:INFO:Status:Writing Status to /tmp/polykey-test-eSpCi5/agent2/status.json
INFO:agent2:INFO:Status:Status is DEAD
INFO:agent2:ErrorGRPCClientCall: Generic call error - 13 INTERNAL: Received RST_STREAM with code 2 triggered by internal client error: Protocol error

@CMCDragonkai
Member

The createConnection right now is called twice and produces a log message each time saying that it is creating a connection. This is confusing; creation messages should only be logged when an actual creation is done.

One could rename createConnection to getConnection to make this more obvious.
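
A rough sketch of the get-or-create shape being suggested; the class and method names here are illustrative, not the actual NodeConnectionManager API:

```ts
import Logger from '@matrixai/logger';

// Sketch: only log "Creating connection" when a connection is actually
// constructed; cache hits are logged at debug level instead.
class ConnectionCache<Conn> {
  protected connections = new Map<string, Conn>();

  constructor(
    protected logger: Logger,
    protected create: (nodeId: string) => Promise<Conn>,
  ) {}

  public async getConnection(nodeId: string): Promise<Conn> {
    const existing = this.connections.get(nodeId);
    if (existing != null) {
      this.logger.debug(`Reusing existing connection to ${nodeId}`);
      return existing;
    }
    this.logger.info(`Creating connection to ${nodeId}`);
    const conn = await this.create(nodeId);
    this.connections.set(nodeId, conn);
    return conn;
  }
}
```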

@CMCDragonkai
Member

CMCDragonkai commented Apr 4, 2022

The NodeConnectionManager.withConnF should be making use of the ResourceRelease type that supports an exception:

type ResourceRelease = (e?: Error) => Promise<void>;

Which means the logic that handles destroying the connection upon an exception should live in NodeConnectionManager.acquireConnection.

However atm you're doing:

      if (
        err instanceof nodesErrors.ErrorNodeConnectionDestroyed ||
        err instanceof grpcErrors.ErrorGRPC ||
        err instanceof agentErrors.ErrorAgentClientDestroyed
      ) {
        // Error with connection, shutting connection down
        await this.destroyConnection(targetNodeId);
      }

There are too many ways for a node connection to be destroyed. This should be reduced to one exception hierarchy for destroying a node connection. Atm, the ErrorGRPC itself is too broad, since there are many exceptions under that tree that don't mean the node connection should be destroyed. This should be reviewed further in detail, and we should have js-resources integrated asap.

An example of this in NodeConnectionManager.acquireConnection:

      return [
        async (e?: Error) => {
          release();
          if (e != null && e instanceof X) {
            // LOG a message here (this is a good place to indicate the log message for why you're about to destroy the connection
            await this.destroyConnection();
          }
        },
        connAndLock.connection,
      ];
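
For context, a rough sketch of the caller side, i.e. how a with-style wrapper such as withConnF could pass the thrown error into that release function (the actual withConnF / js-resources signatures may differ):

```ts
type ResourceRelease = (e?: Error) => Promise<void>;
type ResourceAcquire<T> = () => Promise<[ResourceRelease, T]>;

// Sketch: the release function always runs and receives the error (if any),
// so acquireConnection can decide whether to destroy the connection.
async function withF<T, R>(
  acquire: ResourceAcquire<T>,
  f: (resource: T) => Promise<R>,
): Promise<R> {
  const [release, resource] = await acquire();
  let error: Error | undefined;
  try {
    return await f(resource);
  } catch (e) {
    error = e as Error;
    throw e;
  } finally {
    await release(error);
  }
}
```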

@emmacasolin
Contributor

Without destroying the connection:

INFO:agent2:INFO:NodeConnectionManager:Creating connection to v3j67p2s1vh66cvlh8qmtuu0b91af3jucbautn205t3ieedm4ecq0
INFO:agent2:111111111111111111
INFO:agent2:ErrorGRPCClientCall 13 INTERNAL: Received RST_STREAM with code 2 triggered by internal client error: Protocol error
INFO:agent2:WARN:PolykeyAgent:Failed Starting PolykeyAgent
INFO:agent2:INFO:Status:Begin Status STOPPING
INFO:agent2:INFO:Status:Writing Status to /tmp/polykey-test-53PwsH/agent2/status.json
INFO:agent2:INFO:clientFactory:Destroying GRPCClientAgent connected to 127.0.0.1:38099
INFO:agent2:INFO:127.0.0.1:38099:Destroying NodeConnection
INFO:agent2:INFO:127.0.0.1:38099:Destroyed NodeConnection
INFO:agent2:INFO:clientFactory:Destroyed GRPCClientAgent connected to 127.0.0.1:38099
INFO:agent2:WARN:ConnectionForward 127.0.0.1:38099:Client Error: Error: write EPIPE
INFO:agent2:INFO:ConnectionForward 127.0.0.1:38099:Stopping Connection Forward
INFO:agent2:INFO:Status:Status is STOPPING
INFO:agent2:INFO:SessionManager:Stopping SessionManager
INFO:agent2:INFO:SessionManager:Stopped SessionManager
INFO:agent2:INFO:NotificationsManager:Stopping NotificationsManager
INFO:agent2:INFO:NotificationsManager:Stopped NotificationsManager
INFO:agent2:INFO:VaultManager:Stopping VaultManager
INFO:agent2:INFO:EncryptedFileSystem:Stopping EncryptedFS
INFO:agent2:INFO:INodeManager:Stopping INodeManager
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:43152:Stopping Connection Reverse
INFO:agent2:INFO:INodeManager:Stopped INodeManager
INFO:agent2:INFO:DB:Stopping DB
INFO:agent2:INFO:DB:Stopped DB
INFO:agent2:INFO:EncryptedFileSystem:Stopped EncryptedFS
INFO:agent2:INFO:VaultManager:Stopped VaultManager
INFO:agent2:INFO:Discovery:Stopping Discovery
INFO:agent2:INFO:Discovery:Stopped Discovery
INFO:agent2:INFO:Proxy:Stopping Proxy Server
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:43152:Stopped Connection Reverse
INFO:agent2:WARN:ConnectionForward 127.0.0.1:38099:Forward Error: ErrorConnectionEndTimeout
INFO:agent2:INFO:ConnectionForward 127.0.0.1:38099:Stopped Connection Forward
INFO:agent2:INFO:Proxy:Stopped Proxy Server
INFO:agent2:INFO:GRPCServerAgent:Stopping GRPCServer
INFO:agent2:INFO:GRPCServerAgent:Stopped GRPCServer
INFO:agent2:INFO:GRPCServerClient:Stopping GRPCServer
INFO:agent2:INFO:GRPCServerClient:Stopped GRPCServer
INFO:agent2:INFO:GestaltGraph:Stopping GestaltGraph
INFO:agent2:INFO:GestaltGraph:Stopped GestaltGraph
INFO:agent2:INFO:ACL:Stopping ACL
INFO:agent2:INFO:ACL:Stopped ACL
INFO:agent2:INFO:Sigchain:Stopping Sigchain
INFO:agent2:INFO:Sigchain:Stopped Sigchain
INFO:agent2:INFO:IdentitiesManager:Stopping IdentitiesManager
INFO:agent2:INFO:IdentitiesManager:Stopped IdentitiesManager
INFO:agent2:INFO:DB:Stopping DB
INFO:agent2:INFO:DB:Stopped DB
INFO:agent2:INFO:KeyManager:Stopping KeyManager
INFO:agent2:INFO:KeyManager:Stopped KeyManager
INFO:agent2:INFO:Schema:Stopping Schema
INFO:agent2:INFO:Schema:Stopped Schema
INFO:agent2:INFO:Status:Stopping Status
INFO:agent2:INFO:Status:Writing Status to /tmp/polykey-test-53PwsH/agent2/status.json
INFO:agent2:INFO:Status:Status is DEAD
INFO:agent2:ErrorGRPCClientCall: Generic call error - 13 INTERNAL: Received RST_STREAM with code 2 triggered by internal client error: Protocol error

@emmacasolin
Contributor

Debug mode:

INFO:agent2:INFO:ConnectionForward 127.0.0.1:37987:Started Connection Forward
INFO:agent2:INFO:ConnectionForward 127.0.0.1:37987:Composing Connection Forward
INFO:agent2:INFO:ConnectionForward 127.0.0.1:37987:Composed Connection Forward
INFO:agent2:INFO:Proxy:Handled CONNECT to 127.0.0.1:37987
INFO:agent2:INFO:clientFactory:Created GRPCClientAgent connecting to 127.0.0.1:37987
INFO:agent2:DEBUG:clientFactory:Watch Channel State: READY
INFO:agent2:INFO:127.0.0.1:37987:Created NodeConnection
INFO:agent2:INFO:NodeConnectionManager:Creating connection to v8cpocr72gk9h61gdaqjt3uokka344pf1msf8p40bpo338jj7tdjg
INFO:agent2:111111111111111111
INFO:agent2:DEBUG:clientFactory:Watch Channel State: IDLE
INFO:agent2:INFO:clientFactory:Destroying GRPCClientAgent connected to 127.0.0.1:37987
INFO:agent2:INFO:127.0.0.1:37987:Destroying NodeConnection
INFO:agent2:INFO:127.0.0.1:37987:Destroyed NodeConnection
INFO:agent2:INFO:clientFactory:Destroyed GRPCClientAgent connected to 127.0.0.1:37987
INFO:agent2:WARN:ConnectionForward 127.0.0.1:37987:Client Error: Error: write EPIPE
INFO:agent2:INFO:ConnectionForward 127.0.0.1:37987:Stopping Connection Forward
INFO:agent2:DEBUG:ConnectionForward 127.0.0.1:37987:Sends tlsSocket ending
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:33977:Stopping Connection Reverse
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:33977:Stopped Connection Reverse
INFO:agent2:WARN:ConnectionForward 127.0.0.1:37987:Forward Error: ErrorConnectionEndTimeout
INFO:agent2:INFO:ConnectionForward 127.0.0.1:37987:Stopped Connection Forward
INFO:agent2:ErrorGRPCClientCall 13 INTERNAL: Received RST_STREAM with code 2 triggered by internal client error: Protocol error
INFO:agent2:WARN:PolykeyAgent:Failed Starting PolykeyAgent
INFO:agent2:INFO:Status:Begin Status STOPPING
INFO:agent2:INFO:Status:Writing Status to /tmp/polykey-test-3NrGqY/agent2/status.json
INFO:agent2:INFO:Status:Status is STOPPING
INFO:agent2:INFO:SessionManager:Stopping SessionManager
INFO:agent2:INFO:SessionManager:Stopped SessionManager
INFO:agent2:INFO:NotificationsManager:Stopping NotificationsManager
INFO:agent2:INFO:NotificationsManager:Stopped NotificationsManager
INFO:agent2:INFO:VaultManager:Stopping VaultManager
INFO:agent2:INFO:EncryptedFileSystem:Stopping EncryptedFS
INFO:agent2:INFO:INodeManager:Stopping INodeManager
INFO:agent2:INFO:INodeManager:Stopped INodeManager
INFO:agent2:INFO:DB:Stopping DB
INFO:agent2:INFO:DB:Stopped DB
INFO:agent2:INFO:EncryptedFileSystem:Stopped EncryptedFS
INFO:agent2:INFO:VaultManager:Stopped VaultManager
INFO:agent2:INFO:Discovery:Stopping Discovery
INFO:agent2:INFO:Discovery:Stopped Discovery
INFO:agent2:INFO:Proxy:Stopping Proxy Server
INFO:agent2:INFO:Proxy:Stopped Proxy Server
INFO:agent2:INFO:GRPCServerAgent:Stopping GRPCServer
INFO:agent2:INFO:GRPCServerAgent:Stopped GRPCServer
INFO:agent2:INFO:GRPCServerClient:Stopping GRPCServer
INFO:agent2:INFO:GRPCServerClient:Stopped GRPCServer
INFO:agent2:INFO:GestaltGraph:Stopping GestaltGraph
INFO:agent2:INFO:GestaltGraph:Stopped GestaltGraph
INFO:agent2:INFO:ACL:Stopping ACL
INFO:agent2:INFO:ACL:Stopped ACL
INFO:agent2:INFO:Sigchain:Stopping Sigchain
INFO:agent2:INFO:Sigchain:Stopped Sigchain
INFO:agent2:INFO:IdentitiesManager:Stopping IdentitiesManager
INFO:agent2:INFO:IdentitiesManager:Stopped IdentitiesManager
INFO:agent2:INFO:DB:Stopping DB
INFO:agent2:INFO:DB:Stopped DB
INFO:agent2:INFO:KeyManager:Stopping KeyManager
INFO:agent2:INFO:KeyManager:Stopped KeyManager
INFO:agent2:INFO:Schema:Stopping Schema
INFO:agent2:INFO:Schema:Stopped Schema
INFO:agent2:INFO:Status:Stopping Status
INFO:agent2:INFO:Status:Writing Status to /tmp/polykey-test-3NrGqY/agent2/status.json
INFO:agent2:INFO:Status:Status is DEAD
INFO:agent2:ErrorGRPCClientCall: Generic call error - 13 INTERNAL: Received RST_STREAM with code 2 triggered by internal client error: Protocol error

@emmacasolin
Contributor

Failure on Agent 1:

INFO:seed1:INFO:PolykeyAgent:Creating PolykeyAgent
INFO:seed1:INFO:PolykeyAgent:Setting umask to 077
INFO:seed1:INFO:PolykeyAgent:Setting node path to /tmp/polykey-test-u2rtVH/seed1
INFO:seed1:INFO:Status:Starting Status
INFO:seed1:INFO:Status:Writing Status to /tmp/polykey-test-u2rtVH/seed1/status.json
INFO:seed1:INFO:Status:Status is STARTING
INFO:seed1:INFO:Schema:Creating Schema
INFO:seed1:INFO:Schema:Starting Schema
INFO:seed1:INFO:Schema:Setting state path to /tmp/polykey-test-u2rtVH/seed1/state
INFO:seed1:INFO:Schema:Started Schema
INFO:seed1:INFO:Schema:Created Schema
INFO:seed1:INFO:KeyManager:Creating KeyManager
INFO:seed1:INFO:KeyManager:Setting keys path to /tmp/polykey-test-u2rtVH/seed1/state/keys
INFO:seed1:INFO:KeyManager:Starting KeyManager
INFO:seed1:INFO:KeyManager:Checking /tmp/polykey-test-u2rtVH/seed1/state/keys/root.pub and /tmp/polykey-test-u2rtVH/seed1/state/keys/root.key
INFO:seed1:INFO:KeyManager:Generating root key pair
INFO:seed1:INFO:KeyManager:Writing /tmp/polykey-test-u2rtVH/seed1/state/keys/root.pub and /tmp/polykey-test-u2rtVH/seed1/state/keys/root.key
INFO:seed1:INFO:KeyManager:Checking /tmp/polykey-test-u2rtVH/seed1/state/keys/root.crt
INFO:seed1:INFO:KeyManager:Generating root certificate
INFO:seed1:INFO:KeyManager:Writing /tmp/polykey-test-u2rtVH/seed1/state/keys/root.crt
INFO:seed1:INFO:KeyManager:Checking /tmp/polykey-test-u2rtVH/seed1/state/keys/db.key
INFO:seed1:INFO:KeyManager:Generating keys db key
INFO:seed1:INFO:KeyManager:Writing /tmp/polykey-test-u2rtVH/seed1/state/keys/db.key
INFO:seed1:INFO:KeyManager:Started KeyManager
INFO:seed1:INFO:KeyManager:Created KeyManager
INFO:seed1:INFO:DB:Creating DB
INFO:seed1:INFO:DB:Starting DB
INFO:seed1:INFO:DB:Setting DB path to /tmp/polykey-test-u2rtVH/seed1/state/db
INFO:seed1:INFO:DB:Started DB
INFO:seed1:INFO:DB:Created DB
INFO:seed1:INFO:IdentitiesManager:Creating IdentitiesManager
INFO:seed1:INFO:IdentitiesManager:Starting IdentitiesManager
INFO:seed1:INFO:IdentitiesManager:Started IdentitiesManager
INFO:seed1:INFO:IdentitiesManager:Created IdentitiesManager
INFO:seed1:INFO:Sigchain:Creating Sigchain
INFO:seed1:INFO:Sigchain:Starting Sigchain
INFO:seed1:INFO:Sigchain:Started Sigchain
INFO:seed1:INFO:Sigchain:Created Sigchain
INFO:seed1:INFO:ACL:Creating ACL
INFO:seed1:INFO:ACL:Starting ACL
INFO:seed1:INFO:ACL:Started ACL
INFO:seed1:INFO:ACL:Created ACL
INFO:seed1:INFO:GestaltGraph:Creating GestaltGraph
INFO:seed1:INFO:GestaltGraph:Starting GestaltGraph
INFO:seed1:INFO:GestaltGraph:Started GestaltGraph
INFO:seed1:INFO:GestaltGraph:Created GestaltGraph
INFO:seed1:INFO:Proxy:Creating Proxy
INFO:seed1:INFO:Proxy:Created Proxy
INFO:seed1:INFO:NodeGraph:Creating NodeGraph
INFO:seed1:INFO:NodeGraph:Starting NodeGraph
INFO:seed1:INFO:NodeGraph:Started NodeGraph
INFO:seed1:INFO:NodeGraph:Created NodeGraph
INFO:seed1:INFO:Discovery:Creating Discovery
INFO:seed1:INFO:Discovery:Starting Discovery
INFO:seed1:INFO:Discovery:Started Discovery
INFO:seed1:INFO:Discovery:Created Discovery
INFO:seed1:INFO:NotificationsManager:Creating NotificationsManager
INFO:seed1:INFO:NotificationsManager:Starting NotificationsManager
INFO:seed1:INFO:NotificationsManager:Started NotificationsManager
INFO:seed1:INFO:NotificationsManager:Created NotificationsManager
INFO:seed1:INFO:VaultManager:Creating VaultManager
INFO:seed1:INFO:VaultManager:Setting vaults path to /tmp/polykey-test-u2rtVH/seed1/state/vaults
INFO:seed1:INFO:VaultManager:Starting VaultManager
INFO:seed1:INFO:VaultManager:Generating vaults key
INFO:seed1:INFO:DB:Creating DB
INFO:seed1:INFO:DB:Starting DB
INFO:seed1:INFO:DB:Setting DB path to /tmp/polykey-test-u2rtVH/seed1/state/vaults/efs
INFO:seed1:INFO:DB:Started DB
INFO:seed1:INFO:DB:Created DB
INFO:seed1:INFO:INodeManager:Creating INodeManager
INFO:seed1:INFO:INodeManager:Starting INodeManager
INFO:seed1:INFO:INodeManager:Started INodeManager
INFO:seed1:INFO:INodeManager:Created INodeManager
INFO:seed1:INFO:EncryptedFileSystem:Starting EncryptedFS
INFO:seed1:INFO:EncryptedFileSystem:Started EncryptedFS
INFO:seed1:INFO:VaultManager:Started VaultManager
INFO:seed1:INFO:VaultManager:Created VaultManager
INFO:seed1:INFO:SessionManager:Creating SessionManager
INFO:seed1:INFO:SessionManager:Starting SessionManager
INFO:seed1:INFO:SessionManager:Generating sessions key
INFO:seed1:INFO:SessionManager:Started SessionManager
INFO:seed1:INFO:SessionManager:Created SessionManager
INFO:seed1:INFO:PolykeyAgent:Starting PolykeyAgent
INFO:seed1:INFO:GRPCServerClient:Starting GRPCServer on 127.0.0.1:0
INFO:seed1:INFO:GRPCServerClient:Started GRPCServer on 127.0.0.1:45343
INFO:seed1:INFO:GRPCServerAgent:Starting GRPCServer on 127.0.0.1:0
INFO:seed1:INFO:GRPCServerAgent:Started GRPCServer on 127.0.0.1:35107
INFO:seed1:INFO:Proxy:Starting Forward Proxy from 127.0.0.1:0 to 127.0.0.1:0 and Reverse Proxy from 127.0.0.1:0 to 127.0.0.1:35107
INFO:seed1:INFO:Proxy:Started Forward Proxy from 127.0.0.1:42731 to 127.0.0.1:37245 and Reverse Proxy from 127.0.0.1:37245 to 127.0.0.1:35107
INFO:seed1:INFO:NodeConnectionManager:Starting NodeConnectionManager
INFO:seed1:INFO:NodeConnectionManager:Started NodeConnectionManager
INFO:seed1:INFO:Status:Finish Status STARTING
INFO:seed1:INFO:Status:Writing Status to /tmp/polykey-test-u2rtVH/seed1/status.json
INFO:seed1:INFO:Status:Status is LIVE
INFO:seed1:INFO:PolykeyAgent:Started PolykeyAgent
INFO:seed1:INFO:PolykeyAgent:Created PolykeyAgent
INFO:agent1:INFO:PolykeyAgent:Creating PolykeyAgent
INFO:agent1:INFO:PolykeyAgent:Setting umask to 077
INFO:agent1:INFO:PolykeyAgent:Setting node path to /tmp/polykey-test-u2rtVH/agent1
INFO:agent1:INFO:Status:Starting Status
INFO:agent1:INFO:Status:Writing Status to /tmp/polykey-test-u2rtVH/agent1/status.json
INFO:agent1:INFO:Status:Status is STARTING
INFO:agent1:INFO:Schema:Creating Schema
INFO:agent1:INFO:Schema:Starting Schema
INFO:agent1:INFO:Schema:Setting state path to /tmp/polykey-test-u2rtVH/agent1/state
INFO:agent1:INFO:Schema:Started Schema
INFO:agent1:INFO:Schema:Created Schema
INFO:agent1:INFO:KeyManager:Creating KeyManager
INFO:agent1:INFO:KeyManager:Setting keys path to /tmp/polykey-test-u2rtVH/agent1/state/keys
INFO:agent1:INFO:KeyManager:Starting KeyManager
INFO:agent1:INFO:KeyManager:Checking /tmp/polykey-test-u2rtVH/agent1/state/keys/root.pub and /tmp/polykey-test-u2rtVH/agent1/state/keys/root.key
INFO:agent1:INFO:KeyManager:Generating root key pair
INFO:agent1:INFO:KeyManager:Writing /tmp/polykey-test-u2rtVH/agent1/state/keys/root.pub and /tmp/polykey-test-u2rtVH/agent1/state/keys/root.key
INFO:agent1:INFO:KeyManager:Checking /tmp/polykey-test-u2rtVH/agent1/state/keys/root.crt
INFO:agent1:INFO:KeyManager:Generating root certificate
INFO:agent1:INFO:KeyManager:Writing /tmp/polykey-test-u2rtVH/agent1/state/keys/root.crt
INFO:agent1:INFO:KeyManager:Checking /tmp/polykey-test-u2rtVH/agent1/state/keys/db.key
INFO:agent1:INFO:KeyManager:Generating keys db key
INFO:agent1:INFO:KeyManager:Writing /tmp/polykey-test-u2rtVH/agent1/state/keys/db.key
INFO:agent1:INFO:KeyManager:Started KeyManager
INFO:agent1:INFO:KeyManager:Created KeyManager
INFO:agent1:INFO:DB:Creating DB
INFO:agent1:INFO:DB:Starting DB
INFO:agent1:INFO:DB:Setting DB path to /tmp/polykey-test-u2rtVH/agent1/state/db
INFO:agent1:INFO:DB:Started DB
INFO:agent1:INFO:DB:Created DB
INFO:agent1:INFO:IdentitiesManager:Creating IdentitiesManager
INFO:agent1:INFO:IdentitiesManager:Starting IdentitiesManager
INFO:agent1:INFO:IdentitiesManager:Started IdentitiesManager
INFO:agent1:INFO:IdentitiesManager:Created IdentitiesManager
INFO:agent1:INFO:Sigchain:Creating Sigchain
INFO:agent1:INFO:Sigchain:Starting Sigchain
INFO:agent1:INFO:Sigchain:Started Sigchain
INFO:agent1:INFO:Sigchain:Created Sigchain
INFO:agent1:INFO:ACL:Creating ACL
INFO:agent1:INFO:ACL:Starting ACL
INFO:agent1:INFO:ACL:Started ACL
INFO:agent1:INFO:ACL:Created ACL
INFO:agent1:INFO:GestaltGraph:Creating GestaltGraph
INFO:agent1:INFO:GestaltGraph:Starting GestaltGraph
INFO:agent1:INFO:GestaltGraph:Started GestaltGraph
INFO:agent1:INFO:GestaltGraph:Created GestaltGraph
INFO:agent1:INFO:Proxy:Creating Proxy
INFO:agent1:INFO:Proxy:Created Proxy
INFO:agent1:INFO:NodeGraph:Creating NodeGraph
INFO:agent1:INFO:NodeGraph:Starting NodeGraph
INFO:agent1:INFO:NodeGraph:Started NodeGraph
INFO:agent1:INFO:NodeGraph:Created NodeGraph
INFO:agent1:INFO:Discovery:Creating Discovery
INFO:agent1:INFO:Discovery:Starting Discovery
INFO:agent1:INFO:Discovery:Started Discovery
INFO:agent1:INFO:Discovery:Created Discovery
INFO:agent1:INFO:NotificationsManager:Creating NotificationsManager
INFO:agent1:INFO:NotificationsManager:Starting NotificationsManager
INFO:agent1:INFO:NotificationsManager:Started NotificationsManager
INFO:agent1:INFO:NotificationsManager:Created NotificationsManager
INFO:agent1:INFO:VaultManager:Creating VaultManager
INFO:agent1:INFO:VaultManager:Setting vaults path to /tmp/polykey-test-u2rtVH/agent1/state/vaults
INFO:agent1:INFO:VaultManager:Starting VaultManager
INFO:agent1:INFO:VaultManager:Generating vaults key
INFO:agent1:INFO:DB:Creating DB
INFO:agent1:INFO:DB:Starting DB
INFO:agent1:INFO:DB:Setting DB path to /tmp/polykey-test-u2rtVH/agent1/state/vaults/efs
INFO:agent1:INFO:DB:Started DB
INFO:agent1:INFO:DB:Created DB
INFO:agent1:INFO:INodeManager:Creating INodeManager
INFO:agent1:INFO:INodeManager:Starting INodeManager
INFO:agent1:INFO:INodeManager:Started INodeManager
INFO:agent1:INFO:INodeManager:Created INodeManager
INFO:agent1:INFO:EncryptedFileSystem:Starting EncryptedFS
INFO:agent1:INFO:EncryptedFileSystem:Started EncryptedFS
INFO:agent1:INFO:VaultManager:Started VaultManager
INFO:agent1:INFO:VaultManager:Created VaultManager
INFO:agent1:INFO:SessionManager:Creating SessionManager
INFO:agent1:INFO:SessionManager:Starting SessionManager
INFO:agent1:INFO:SessionManager:Generating sessions key
INFO:agent1:INFO:SessionManager:Started SessionManager
INFO:agent1:INFO:SessionManager:Created SessionManager
INFO:agent1:INFO:PolykeyAgent:Starting PolykeyAgent
INFO:agent1:INFO:GRPCServerClient:Starting GRPCServer on 127.0.0.1:0
INFO:agent1:INFO:GRPCServerClient:Started GRPCServer on 127.0.0.1:34231
INFO:agent1:INFO:GRPCServerAgent:Starting GRPCServer on 127.0.0.1:0
INFO:agent1:INFO:GRPCServerAgent:Started GRPCServer on 127.0.0.1:41607
INFO:agent1:INFO:Proxy:Starting Forward Proxy from 127.0.0.1:0 to 127.0.0.1:0 and Reverse Proxy from 127.0.0.1:0 to 127.0.0.1:41607
INFO:agent1:INFO:Proxy:Started Forward Proxy from 127.0.0.1:40403 to 127.0.0.1:58749 and Reverse Proxy from 127.0.0.1:58749 to 127.0.0.1:41607
INFO:agent1:INFO:NodeConnectionManager:Starting NodeConnectionManager
INFO:agent1:INFO:NodeConnectionManager:Started NodeConnectionManager
INFO:agent1:INFO:NodeConnectionManager:Creating connection to v2pl6bvuks4rd5d87qg7pkai7pmmgpku92f1i7uo0hoclak4g6l2g
INFO:agent1:INFO:NodeConnectionManager:no existing entry: creating connection to v2pl6bvuks4rd5d87qg7pkai7pmmgpku92f1i7uo0hoclak4g6l2g
INFO:agent1:INFO:127.0.0.1:37245:Creating NodeConnection
INFO:agent1:INFO:clientFactory:Creating GRPCClientAgent connecting to 127.0.0.1:37245
INFO:agent1:INFO:Proxy:Handling CONNECT to 127.0.0.1:37245
INFO:agent1:INFO:ConnectionForward 127.0.0.1:37245:Starting Connection Forward
INFO:seed1:INFO:Proxy:Handling connection from 127.0.0.1:58749
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:58749:Starting Connection Reverse
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:58749:Started Connection Reverse
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:58749:Composing Connection Reverse
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:58749:Composed Connection Reverse
INFO:seed1:INFO:Proxy:Handled connection from 127.0.0.1:58749
INFO:agent1:INFO:ConnectionForward 127.0.0.1:37245:Started Connection Forward
INFO:agent1:INFO:ConnectionForward 127.0.0.1:37245:Composing Connection Forward
INFO:agent1:INFO:ConnectionForward 127.0.0.1:37245:Composed Connection Forward
INFO:agent1:INFO:Proxy:Handled CONNECT to 127.0.0.1:37245
INFO:agent1:INFO:clientFactory:Created GRPCClientAgent connecting to 127.0.0.1:37245
INFO:agent1:DEBUG:clientFactory:Watch Channel State: READY
INFO:agent1:INFO:127.0.0.1:37245:Created NodeConnection
INFO:agent1:INFO:NodeConnectionManager:Creating connection to v2pl6bvuks4rd5d87qg7pkai7pmmgpku92f1i7uo0hoclak4g6l2g
INFO:agent1:111111111111111111
INFO:agent1:DEBUG:clientFactory:Watch Channel State: IDLE
INFO:agent1:INFO:clientFactory:Destroying GRPCClientAgent connected to 127.0.0.1:37245
INFO:agent1:INFO:127.0.0.1:37245:Destroying NodeConnection
INFO:agent1:INFO:127.0.0.1:37245:Destroyed NodeConnection
INFO:agent1:INFO:clientFactory:Destroyed GRPCClientAgent connected to 127.0.0.1:37245
INFO:agent1:WARN:ConnectionForward 127.0.0.1:37245:Client Error: Error: write EPIPE
INFO:agent1:INFO:ConnectionForward 127.0.0.1:37245:Stopping Connection Forward
INFO:agent1:DEBUG:ConnectionForward 127.0.0.1:37245:Sends tlsSocket ending
INFO:seed1:DEBUG:ConnectionReverse 127.0.0.1:58749:Receives tlsSocket ending
INFO:seed1:DEBUG:ConnectionReverse 127.0.0.1:58749:Responds tlsSocket ending
INFO:seed1:DEBUG:ConnectionReverse 127.0.0.1:58749:Responded tlsSocket ending
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:58749:Stopping Connection Reverse
INFO:seed1:DEBUG:ConnectionReverse 127.0.0.1:58749:Sends serverSocket ending
INFO:seed1:INFO:ConnectionReverse 127.0.0.1:58749:Stopped Connection Reverse
INFO:agent1:WARN:ConnectionForward 127.0.0.1:37245:Forward Error: ErrorConnectionEndTimeout
INFO:agent1:INFO:ConnectionForward 127.0.0.1:37245:Stopped Connection Forward
INFO:agent1:ErrorGRPCClientCall 13 INTERNAL: Received RST_STREAM with code 2 triggered by internal client error: Protocol error
INFO:agent1:WARN:PolykeyAgent:Failed Starting PolykeyAgent
INFO:agent1:INFO:Status:Begin Status STOPPING
INFO:agent1:INFO:Status:Writing Status to /tmp/polykey-test-u2rtVH/agent1/status.json
INFO:agent1:INFO:Status:Status is STOPPING
INFO:agent1:INFO:SessionManager:Stopping SessionManager
INFO:agent1:INFO:SessionManager:Stopped SessionManager
INFO:agent1:INFO:NotificationsManager:Stopping NotificationsManager
INFO:agent1:INFO:NotificationsManager:Stopped NotificationsManager
INFO:agent1:INFO:VaultManager:Stopping VaultManager
INFO:agent1:INFO:EncryptedFileSystem:Stopping EncryptedFS
INFO:agent1:INFO:INodeManager:Stopping INodeManager
INFO:agent1:INFO:INodeManager:Stopped INodeManager
INFO:agent1:INFO:DB:Stopping DB
INFO:agent1:INFO:DB:Stopped DB
INFO:agent1:INFO:EncryptedFileSystem:Stopped EncryptedFS
INFO:agent1:INFO:VaultManager:Stopped VaultManager
INFO:agent1:INFO:Discovery:Stopping Discovery
INFO:agent1:INFO:Discovery:Stopped Discovery
INFO:agent1:INFO:Proxy:Stopping Proxy Server
INFO:agent1:INFO:Proxy:Stopped Proxy Server
INFO:agent1:INFO:GRPCServerAgent:Stopping GRPCServer
INFO:agent1:INFO:GRPCServerAgent:Stopped GRPCServer
INFO:agent1:INFO:GRPCServerClient:Stopping GRPCServer
INFO:agent1:INFO:GRPCServerClient:Stopped GRPCServer
INFO:agent1:INFO:GestaltGraph:Stopping GestaltGraph
INFO:agent1:INFO:GestaltGraph:Stopped GestaltGraph
INFO:agent1:INFO:ACL:Stopping ACL
INFO:agent1:INFO:ACL:Stopped ACL
INFO:agent1:INFO:Sigchain:Stopping Sigchain
INFO:agent1:INFO:Sigchain:Stopped Sigchain
INFO:agent1:INFO:IdentitiesManager:Stopping IdentitiesManager
INFO:agent1:INFO:IdentitiesManager:Stopped IdentitiesManager
INFO:agent1:INFO:DB:Stopping DB
INFO:agent1:INFO:DB:Stopped DB
INFO:agent1:INFO:KeyManager:Stopping KeyManager
INFO:agent1:INFO:KeyManager:Stopped KeyManager
INFO:agent1:INFO:Schema:Stopping Schema
INFO:agent1:INFO:Schema:Stopped Schema
INFO:agent1:INFO:Status:Stopping Status
INFO:agent1:INFO:Status:Writing Status to /tmp/polykey-test-u2rtVH/agent1/status.json
INFO:agent1:INFO:Status:Status is DEAD
INFO:agent1:ErrorGRPCClientCall: Generic call error - 13 INTERNAL: Received RST_STREAM with code 2 triggered by internal client error: Protocol error

@CMCDragonkai
Member

It seems that the Client Error is happening regardless of whether we destroy the GRPC client in the watch connectivity state handler.

The watch connectivity state appears to trigger and go from READY to IDLE, indicating the connection has been closed, possibly by the internal GRPC client library due to the protocol error. The Client Error on the client socket might just be an adjacent symptom of the protocol error itself.
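
For reference, this is roughly how the connectivity state can be watched on the underlying grpc-js channel (a minimal sketch using the raw grpc-js API, not our GRPCClient wrapper; the re-arming loop is just for illustration):

import * as grpc from '@grpc/grpc-js';

// Log every connectivity state transition on a channel (READY, IDLE, etc.).
// Pass in `client.getChannel()` from any grpc-js client to observe it.
function watchStates(channel: grpc.Channel): void {
  const current = channel.getConnectivityState(false);
  channel.watchConnectivityState(current, Date.now() + 60_000, (err) => {
    if (err != null) return; // deadline expired without a state change
    const next = channel.getConnectivityState(false);
    console.log('Watch Channel State:', grpc.connectivityState[next]);
    watchStates(channel); // re-arm to observe the next transition
  });
}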

We will need the Wireshark GRPC dissector applied to the clientSocket and the serverSocket.

Establish a base case with 2 GRPC clients connected to a single GRPC server doing echo calls, and export the GRPC log from Wireshark. Then compare this to what happens when using the proxy; because the proxy is transparent, the logs should be equivalent. Then spot the difference in the frames.

@CMCDragonkai
Member

@emmacasolin

Furthermore, don't forget about the additional --verbose option to be passed to pkSpawn when doing this. That's required for the nodes to actually emit debug logs as subprocesses.

@CMCDragonkai
Member

Further tests involving GRPC. We've created a simple test script that doesn't show this problem.

Note that I've changed:

  diff --git a/tests/grpc/utils/utils.ts b/tests/grpc/utils/utils.ts
  index 67370aba..879ba21a 100644
  --- a/tests/grpc/utils/utils.ts
  +++ b/tests/grpc/utils/utils.ts
  @@ -20,7 +20,7 @@ async function openTestServer(
     server.addService(TestServiceService, testService);
     const bindAsync = promisify(server.bindAsync).bind(server);
     const port = await bindAsync(
  -    `127.0.0.1:0`,
  +    `127.0.0.2:55554`,
       grpcUtils.serverInsecureCredentials(),
     );
     server.start();

in order to fix the ports.

import type { Host, Port } from './src/network/types';
import Logger, { LogLevel, StreamHandler, formatting } from '@matrixai/logger';
import { GRPCServer, utils as grpcUtils } from './src/grpc';
import { GRPCClientTest, openTestServer, closeTestServer, createTestService } from './tests/grpc/utils';
import { Proxy } from './src/network';
import * as grpc from '@grpc/grpc-js';
import * as keysUtils from './src/keys/utils';
import { sleep } from './src/utils';
import * as utilsPB from './src/proto/js/polykey/v1/utils/utils_pb';

async function main () {

  const loggerServer = new Logger('SERVER', LogLevel.DEBUG, [
    new StreamHandler(
      formatting.format`${formatting.level}:${formatting.keys}:${formatting.msg} - ${formatting.date}`
    ),
  ]);

  const loggerClient1 = new Logger('CLIENT1', LogLevel.DEBUG, [
    new StreamHandler(
      formatting.format`${formatting.level}:${formatting.keys}:${formatting.msg} - ${formatting.date}`
    ),
  ]);

  const loggerClient2 = new Logger('CLIENT2', LogLevel.DEBUG, [
    new StreamHandler(
      formatting.format`${formatting.level}:${formatting.keys}:${formatting.msg} - ${formatting.date}`
    ),
  ]);

  const authenticate = async (_metaClient, metaServer = new grpc.Metadata()) =>
    metaServer;

  const serverKeyPair = await keysUtils.generateKeyPair(1024);
  const serverKeyPairPem = keysUtils.keyPairToPem(serverKeyPair);
  const serverCert = keysUtils.generateCertificate(
    serverKeyPair.publicKey,
    serverKeyPair.privateKey,
    serverKeyPair.privateKey,
    12332432423,
  );
  const serverCertPem = keysUtils.certToPem(serverCert);
  const serverNodeId = keysUtils.certNodeId(serverCert)!;

  // TARGET GRPC SERVER
  const [server, serverPort] = await openTestServer(
    authenticate,
    loggerServer.getChild('GRPC SERVER')
  );

  const remoteProxy = new Proxy({
    authToken: 'abc',
    logger: loggerServer.getChild('PROXY'),
  });

  await remoteProxy.start({
    tlsConfig: {
      keyPrivatePem: serverKeyPairPem.privateKey,
      certChainPem: serverCertPem,
    },

    // FORWARD HOST & PORT
    forwardHost: '127.0.0.2' as Host,
    forwardPort: 55553 as Port,

    // GRPC SERVER HOST & PORT
    serverHost: '127.0.0.2' as Host,
    serverPort: serverPort as Port,

    // PROXY HOST & PORT
    proxyHost: '127.0.0.2' as Host,
    proxyPort: 55555 as Port,
  });

  // C? F[ ]P
  console.log('FORWARD');
  console.log(remoteProxy.getForwardHost(), remoteProxy.getForwardPort());

  console.log('PROXY');
  console.log(remoteProxy.getProxyHost(), remoteProxy.getProxyPort());

  // S R?[ ]P
  console.log('SERVER');
  console.log(remoteProxy.getServerHost(), remoteProxy.getServerPort());


  // CREATE THE FIRST CLIENT
  // this side does not have a GRPC server
  // it's not necessary

  const clientKeyPair1 = await keysUtils.generateKeyPair(1024);
  const clientKeyPairPem1 = keysUtils.keyPairToPem(clientKeyPair1);
  const clientCert1 = keysUtils.generateCertificate(
    clientKeyPair1.publicKey,
    clientKeyPair1.privateKey,
    clientKeyPair1.privateKey,
    12332432423,
  );
  const clientCertPem1 = keysUtils.certToPem(clientCert1);
  const clientNodeId1 = keysUtils.certNodeId(clientCert1)!;

  const localProxy1 = new Proxy({
    authToken: 'abc',
    logger: loggerClient1.getChild('PROXY'),
  });

  await localProxy1.start({
    tlsConfig: {
      keyPrivatePem: clientKeyPairPem1.privateKey,
      certChainPem: clientCertPem1,
    },
    forwardHost: '127.0.0.3' as Host,
    forwardPort: 44443 as Port,
    serverHost: '127.0.0.3' as Host,
    serverPort: 44444 as Port,
    proxyHost: '127.0.0.3' as Host,
    proxyPort: 44445 as Port,
  });

  const client1 = await GRPCClientTest.createGRPCClientTest({
    nodeId: serverNodeId,
    host: remoteProxy.getProxyHost(),
    port: remoteProxy.getProxyPort(),
    proxyConfig: {
      host: localProxy1.getForwardHost(),
      port: localProxy1.getForwardPort(),
      authToken: localProxy1.authToken,
    },
    logger: loggerClient1.getChild('GRPC CLIENT'),
  });

  // CREATE SECOND CLIENT
  const clientKeyPair2 = await keysUtils.generateKeyPair(1024);
  const clientKeyPairPem2 = keysUtils.keyPairToPem(clientKeyPair2);
  const clientCert2 = keysUtils.generateCertificate(
    clientKeyPair2.publicKey,
    clientKeyPair2.privateKey,
    clientKeyPair2.privateKey,
    12332432423,
  );
  const clientCertPem2 = keysUtils.certToPem(clientCert2);
  const clientNodeId2 = keysUtils.certNodeId(clientCert2)!;

  const localProxy2 = new Proxy({
    authToken: 'abc',
    logger: loggerClient2.getChild('PROXY'),
  });

  await localProxy2.start({
    tlsConfig: {
      keyPrivatePem: clientKeyPairPem2.privateKey,
      certChainPem: clientCertPem2,
    },
    forwardHost: '127.0.0.4' as Host,
    forwardPort: 33333 as Port,
    serverHost: '127.0.0.4' as Host,
    serverPort: 33334 as Port,
    proxyHost: '127.0.0.4' as Host,
    proxyPort: 33335 as Port,
  });

  const client2 = await GRPCClientTest.createGRPCClientTest({
    nodeId: serverNodeId,
    host: remoteProxy.getProxyHost(),
    port: remoteProxy.getProxyPort(),
    proxyConfig: {
      host: localProxy2.getForwardHost(),
      port: localProxy2.getForwardPort(),
      authToken: localProxy2.authToken,
    },
    logger: loggerClient2.getChild('GRPC CLIENT'),
  });

  console.log('SENDING REQUEST FROM CLIENT 1');

  const m = new utilsPB.EchoMessage();
  const challenge = 'Hello!';
  m.setChallenge(challenge);
  const unaryResponse = await client1.unary(m);
  console.log(unaryResponse.getChallenge());

  console.log('SENT REQUEST FROM CLIENT 1');

  console.log('SENDING REQUEST FROM CLIENT 2');

  const m2 = new utilsPB.EchoMessage();
  const challenge2 = 'Hello!';
  m2.setChallenge(challenge2);
  const unaryResponse2 = await client2.unary(m2);
  console.log(unaryResponse2.getChallenge());

  console.log('SENT REQUEST FROM CLIENT 2');

  // DESTROY FIRST CLIENT
  await client1.destroy();
  await localProxy1.stop();

  // DESTROY SECOND CLIENT
  await client2.destroy();
  await localProxy2.stop();

  // STOP THE SERVER
  await remoteProxy.stop();
  await closeTestServer(server);
}

main();

The packet logs showed a proper procedure for both clients talking to a single server. There are 12 (an even number) frame sequences being sent.

                44443     44445        55555      55553    55554
┌────────┐         ┌───────┐              ┌───────┐          ┌────────┐
│ Client ├─────────┤ Proxy ├──────────────┤ Proxy ├──────────┤ Server │
└────────┘         └───────┘              └───────┘          └────────┘

                                                 ◄─────────────1
                                                    SETTINGS
       2─────────────►
         MAGIC
         SETTINGS
         HEADER
         DATA
                                            3(2)───────────────►
                                                  MAGIC
                                                  SETTINGS
                                                  HEADER
                                                  DATA

       ◄─────────────4(1)
          SETTINGS


                                               ◄────────────────5
                                                  SETTINGS ACK
                                                  HEADERS
                                                  DATA

       6──────────────►
         SETTINGS ACK
                                               ◄────────────────7
                                                HEADERS TRAILING

       ◄───────────────8(5)
         SETTINGS ACK
         HEADERS
         DATA

                                             9(6)───────────────►
                                                  SETTINGS ACK
        ◄────────────────10(7)
         HEADERS TRAILING




       11────────────────►
            RST_STREAM

                                             12(11)──────────────►
                                                    RST_STREAM

@CMCDragonkai
Member

We need to take the script I wrote above and incrementally build it up to reproduce the problem that we are seeing in tests/nat/noNAT.test.ts.

@CMCDragonkai
Member

Also note that when you see UDP packets of length 4, where the data is either 4 bytes of 00000000 or 4 bytes of 00000001, these are the ping/pong packets that serve both for NAT punching and keep-alive. They are sent quite continuously.
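
As a rough illustration (assuming the 4 bytes decode as a big-endian 0 or 1; the actual encoding lives in the network utilities), distinguishing these keep-alive packets from real traffic in a capture script could look like:

// Classify a UDP payload as one of the ping/pong keep-alive packets described
// above; anything else is treated as real (uTP/TLS) traffic.
function classifyPacket(msg: Buffer): 'ping' | 'pong' | 'data' {
  if (msg.length === 4) {
    const value = msg.readUInt32BE(0);
    if (value === 0) return 'ping';
    if (value === 1) return 'pong';
  }
  return 'data';
}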

@tegefaulkes
Contributor Author

I recreated the test using PolykeyAgents and it seems to work.
So it seems the problem is caused by one of the following:

  1. the pk agent start bin CLI command itself;
  2. the pk agent start CLI command parameters;
  3. some kind of interference from using a child_process to run the agent (a rough sketch of this subprocess setup is shown below).
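
A rough sketch of that subprocess setup, for reference (the binary name and the way the tests actually wrap this in pkSpawn are assumptions here; only --root-key-pair-bits and --verbose come from this discussion):

import { spawn } from 'child_process';

// Start an agent the same way the CLI test would: as a child process running
// `pk agent start`, with verbose logging so the subprocess emits debug logs.
const agent = spawn(
  'pk',
  ['agent', 'start', '--root-key-pair-bits', '1024', '--verbose'],
  { env: { ...process.env } },
);
agent.stdout.on('data', (d) => process.stdout.write(d));
agent.stderr.on('data', (d) => process.stderr.write(d));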

@CMCDragonkai
Member

Since our script can generate the keys, we can export these to a file, and then use Wireshark to decrypt the UDP packets on the fly. More understanding of Wireshark will be needed though. That might be useful @emmacasolin.

@tegefaulkes
Contributor Author

tegefaulkes commented Apr 5, 2022

I may have found the problem... The test was trying to speed up generation of the root keypair by using the parameter --root-key-pair-bits 1024. Removing this parameter from each of the agents causes the test to run for much longer, but agent2 now makes the connection and starts up without issue. It looks like setting the root key pair bits for the agents connecting to the seed node causes them to fail the connection.

The question now is, how does changing the root keypair size cause a protocol error in GRPC? I'll dig into this more.

@CMCDragonkai
Member

CMCDragonkai commented Apr 5, 2022

Could be a TLS issue. Some things require larger key sizes. This is another reason we want to move away from RSA since key sizes are not a thing in ECDSA... etc. #168

@CMCDragonkai
Member

Are you sure this is the problem, or is it just intermittent? What if you change the key sizes and try others, somewhere in between 1024 and 2048?

@tegefaulkes
Contributor Author

tegefaulkes commented Apr 5, 2022

There is a pattern between the seed's key bits and the agent's key bits, where Y means it works and N means it fails.
The top row is the agent key bits and the left column is the seed's key bits.

| Seed \ Agent | 1024 | 1100 | 1200 | 1300 | 1400 | 1500 | 2048 | 4096 |
| ------------ | ---- | ---- | ---- | ---- | ---- | ---- | ---- | ---- |
| 1024         | Y    | Y    | Y    | Y    | Y    | Y    | Y    | Y    |
| 2048         | N    | N    | N    | Y    | Y    | Y    | Y    | Y    |
| 4096         | N    | N    | N    | N    | Y    | Y    | Y    | Y    |

I still have no idea why the root keypair sizes would cause the GRPC to fail. still looking into it.

@CMCDragonkai
Member

CMCDragonkai commented Apr 5, 2022

That shows that 1024 & 1024 works. Didn't you say that causes problems?

What are the rows and the columns? Can you label them seed vs agent?

@tegefaulkes
Contributor Author

When I discovered the problem I was doing 4096 for the seed and 1024 for the agent.

@emmacasolin
Contributor

> When I discovered the problem I was doing 4096 for the seed and 1024 for the agent.

But in my no NAT tests I had the seed and both agents set to 1024 bits

@tegefaulkes
Contributor Author

I still don't understand why it's happening, but with them all set to 1024 only agent2 had the problem. If I use the default for the seed and 1024 for the agents then both agent1 and agent2 fail.

@CMCDragonkai
Member

Try reproducing this with the script that I created yesterday.

@tegefaulkes
Contributor Author

Trying the same conditions in the test-connections.ts script: when using 4096 bits for the seed and 1024 bits for each agent, the script runs fine with no problems. Both agents start and are able to echo the seed node.

Could this be a timing issue then?

@CMCDragonkai
Member

I don't think key sizes are the problem here. Try to go back to what we discussed yesterday and slowly build up test-connections.ts to eventually match what the Jest tests are doing.

@tegefaulkes
Contributor Author

For reference, packet summaries:

working

1	0.000000000	127.0.0.1	127.0.0.1	TCP	74	48820 → 42531 [SYN] Seq=0 Win=65495 Len=0 MSS=65495 SACK_PERM=1 TSval=4060474312 TSecr=0 WS=128
2	0.000010900	127.0.0.1	127.0.0.1	TCP	74	42531 → 48820 [SYN, ACK] Seq=0 Ack=1 Win=65483 Len=0 MSS=65495 SACK_PERM=1 TSval=4060474312 TSecr=4060474312 WS=128
3	0.000019599	127.0.0.1	127.0.0.1	TCP	66	48820 → 42531 [ACK] Seq=1 Ack=1 Win=65536 Len=0 TSval=4060474312 TSecr=4060474312
4	0.001302487	127.0.0.1	127.0.0.1	HTTP	251	CONNECT 127.0.0.1:58531/?nodeId=v9t0fqdrkha7mi20f6rrqa1ggv5bqvjf5k8pucntina7sg41f8s20 HTTP/1.1 
5	0.001309787	127.0.0.1	127.0.0.1	TCP	66	42531 → 48820 [ACK] Seq=1 Ack=186 Win=65408 Len=0 TSval=4060474314 TSecr=4060474314
29	0.027683522	127.0.0.1	127.0.0.1	HTTP	105	HTTP/1.1 200 Connection Established 
30	0.027694722	127.0.0.1	127.0.0.1	TCP	66	48820 → 42531 [ACK] Seq=186 Ack=40 Win=65536 Len=0 TSval=4060474340 TSecr=4060474340
35	0.036526933	127.0.0.1	127.0.0.1	GRPC	257	Magic, SETTINGS[0], HEADERS[1]: POST /polykey.v1.AgentService/Echo, DATA[1] (GRPC) (PROTOBUF) polykey.v1.utils.EchoMessage
42	0.037837520	127.0.0.1	127.0.0.1	HTTP2	75	SETTINGS[0]
44	0.037901519	127.0.0.1	127.0.0.1	HTTP2	75	SETTINGS[0]
50	0.043565263	127.0.0.1	127.0.0.1	GRPC	199	SETTINGS[0], HEADERS[1]: 200 OK, DATA[1] (GRPC) (PROTOBUF) polykey.v1.utils.EchoMessage
55	0.083735460	127.0.0.1	127.0.0.1	TCP	66	48820 → 42531 [ACK] Seq=386 Ack=182 Win=65536 Len=0 TSval=4060474396 TSecr=4060474356
56	0.083754160	127.0.0.1	127.0.0.1	HTTP2	101	HEADERS[1]
57	0.083761859	127.0.0.1	127.0.0.1	TCP	66	48820 → 42531 [ACK] Seq=386 Ack=217 Win=65536 Len=0 TSval=4060474396 TSecr=4060474396
59	0.086871728	127.0.0.1	127.0.0.1	GRPC	203	RST_STREAM[1], HEADERS[3]: POST /polykey.v1.AgentService/NodesClosestLocalNodesGet, DATA[3] (GRPC) (PROTOBUF) polykey.v1.nodes.Node
66	0.089285804	127.0.0.1	127.0.0.1	GRPC	94	HEADERS[3]: 200 OK, DATA[3] (GRPC) (PROTOBUF) polykey.v1.nodes.NodeTable
72	0.129756998	127.0.0.1	127.0.0.1	TCP	66	48820 → 42531 [ACK] Seq=523 Ack=245 Win=65536 Len=0 TSval=4060474442 TSecr=4060474402
73	0.129779498	127.0.0.1	127.0.0.1	HTTP2	77	HEADERS[3]
74	0.129786198	127.0.0.1	127.0.0.1	TCP	66	48820 → 42531 [ACK] Seq=523 Ack=256 Win=65536 Len=0 TSval=4060474442 TSecr=4060474442
75	0.131604480	127.0.0.1	127.0.0.1	HTTP2	79	RST_STREAM[3]
79	0.171736077	127.0.0.1	127.0.0.1	TCP	66	42531 → 48820 [ACK] Seq=256 Ack=536 Win=65536 Len=0 TSval=4060474484 TSecr=4060474444
108	10.277305846	127.0.0.1	127.0.0.1	TCP	66	42531 → 48820 [FIN, ACK] Seq=256 Ack=536 Win=65536 Len=0 TSval=4060484590 TSecr=4060474444
109	10.277324646	127.0.0.1	127.0.0.1	TCP	66	48820 → 42531 [FIN, ACK] Seq=536 Ack=257 Win=65536 Len=0 TSval=4060484590 TSecr=4060484590
110	10.277330146	127.0.0.1	127.0.0.1	TCP	66	42531 → 48820 [ACK] Seq=257 Ack=537 Win=65536 Len=0 TSval=4060484590 TSecr=4060484590

failing

5	37.842278020	127.0.0.1	127.0.0.1	TCP	74	59728 → 42217 [SYN] Seq=0 Win=65495 Len=0 MSS=65495 SACK_PERM=1 TSval=4061003826 TSecr=0 WS=128
6	37.842290720	127.0.0.1	127.0.0.1	TCP	74	42217 → 59728 [SYN, ACK] Seq=0 Ack=1 Win=65483 Len=0 MSS=65495 SACK_PERM=1 TSval=4061003826 TSecr=4061003826 WS=128
7	37.842299620	127.0.0.1	127.0.0.1	TCP	66	59728 → 42217 [ACK] Seq=1 Ack=1 Win=65536 Len=0 TSval=4061003826 TSecr=4061003826
8	37.843652008	127.0.0.1	127.0.0.1	HTTP	267	CONNECT 127.0.0.1:36866/?nodeId=vg7fmjr9u0an3m8a2ro7u3ic8rtfgkrqf2pm785vg63ptb0vso7ug HTTP/1.1 
9	37.843658708	127.0.0.1	127.0.0.1	TCP	66	42217 → 59728 [ACK] Seq=1 Ack=202 Win=65408 Len=0 TSval=4061003827 TSecr=4061003827
35	37.870075187	127.0.0.1	127.0.0.1	HTTP	105	HTTP/1.1 200 Connection Established 
36	37.870084887	127.0.0.1	127.0.0.1	TCP	66	59728 → 42217 [ACK] Seq=202 Ack=40 Win=65536 Len=0 TSval=4061003853 TSecr=4061003853
37	37.870268785	127.0.0.1	127.0.0.1	HTTP2	75	SETTINGS[0]
38	37.870275285	127.0.0.1	127.0.0.1	TCP	66	59728 → 42217 [ACK] Seq=202 Ack=49 Win=65536 Len=0 TSval=4061003854 TSecr=4061003854
40	37.877249126	127.0.0.1	127.0.0.1	GRPC	257	Magic, SETTINGS[0], HEADERS[1]: POST /polykey.v1.AgentService/Echo, DATA[1] (GRPC) (PROTOBUF) polykey.v1.utils.EchoMessage
47	37.882938379	127.0.0.1	127.0.0.1	GRPC	199	SETTINGS[0], HEADERS[1]: 200 OK, DATA[1] (GRPC) (PROTOBUF) polykey.v1.utils.EchoMessage
52	37.883944870	127.0.0.1	127.0.0.1	HTTP2	83	GOAWAY[0]
53	37.886253151	127.0.0.1	127.0.0.1	TCP	66	59728 → 42217 [FIN, ACK] Seq=410 Ack=182 Win=65536 Len=0 TSval=4061003870 TSecr=4061003866
54	37.886587448	127.0.0.1	127.0.0.1	HTTP2	101	HEADERS[1]
55	37.886597248	127.0.0.1	127.0.0.1	TCP	54	59728 → 42217 [RST] Seq=411 Win=0 Len=0

@CMCDragonkai
Member

There are 7 places where RST_STREAM is emitted: https://github.com/grpc/grpc-node/blob/master/packages/grpc-js/src/call-stream.ts.

By going into node_modules and monkey-patching some console.log calls you can trace backwards to the origin of this error too.

@tegefaulkes
Contributor Author

Looks like the error is caused by an HTTP 505 protocol error. 505 refers to 505 HTTP Version Not Supported: https://web.archive.org/web/20150924132744/http://www.checkupdown.com/status/E505.html. The full error is:

    Error [ERR_HTTP2_ERROR]: Protocol error
        at Http2Session.onSessionInternalError (internal/http2/core.js:759:26) {
      code: 'ERR_HTTP2_ERROR',
      errno: -505
    }

So it seems to ultimately be a problem with HTTP/2 negotiating the connection, maybe?

@CMCDragonkai
Member

Can we find out what HTTP version is being supplied? As in, where is the condition? MDN says this is because:

> The HyperText Transfer Protocol (HTTP) 505 HTTP Version Not Supported response status code indicates that the HTTP version used in the request is not supported by the server.

@tegefaulkes
Contributor Author

Relevant documentation about HTTP2 negotiation.

[3.5](https://www.rfc-editor.org/rfc/rfc7540#section-3.5). HTTP/2 Connection Preface

   In HTTP/2, each endpoint is required to send a connection preface as
   a final confirmation of the protocol in use and to establish the
   initial settings for the HTTP/2 connection.  The client and server
   each send a different connection preface.

   The client connection preface starts with a sequence of 24 octets,
   which in hex notation is:

     0x505249202a20485454502f322e300d0a0d0a534d0d0a0d0a

   That is, the connection preface starts with the string "PRI *
   HTTP/2.0\r\n\r\nSM\r\n\r\n").  This sequence MUST be followed by a
   SETTINGS frame ([Section 6.5](https://www.rfc-editor.org/rfc/rfc7540#section-6.5)), which MAY be empty.  The client sends
   the client connection preface immediately upon receipt of a 101
   (Switching Protocols) response (indicating a successful upgrade) or
   as the first application data octets of a TLS connection.  If
   starting an HTTP/2 connection with prior knowledge of server support
   for the protocol, the client connection preface is sent upon
   connection establishment.

      Note: The client connection preface is selected so that a large
      proportion of HTTP/1.1 or HTTP/1.0 servers and intermediaries do
      not attempt to process further frames.  Note that this does not
      address the concerns raised in [[TALKING](https://www.rfc-editor.org/rfc/rfc7540#ref-TALKING "Talking to Yourself for Fun and Profit")].

   The server connection preface consists of a potentially empty
   SETTINGS frame ([Section 6.5](https://www.rfc-editor.org/rfc/rfc7540#section-6.5)) that MUST be the first frame the server
   sends in the HTTP/2 connection.

   The SETTINGS frames received from a peer as part of the connection
   preface MUST be acknowledged (see [Section 6.5.3](https://www.rfc-editor.org/rfc/rfc7540#section-6.5.3)) after sending the
   connection preface. To avoid unnecessary latency, clients are permitted to send
   additional frames to the server immediately after sending the client
   connection preface, without waiting to receive the server connection
   preface.  It is important to note, however, that the server
   connection preface SETTINGS frame might include parameters that
   necessarily alter how a client is expected to communicate with the
   server.  Upon receiving the SETTINGS frame, the client is expected to
   honor any parameters established.  In some configurations, it is
   possible for the server to transmit SETTINGS before the client sends
   additional frames, providing an opportunity to avoid this issue.

   Clients and servers MUST treat an invalid connection preface as a
   connection error ([Section 5.4.1](https://www.rfc-editor.org/rfc/rfc7540#section-5.4.1)) of type PROTOCOL_ERROR.  A GOAWAY
   frame ([Section 6.8](https://www.rfc-editor.org/rfc/rfc7540#section-6.8)) MAY be omitted in this case, since an invalid
   preface indicates that the peer is not using HTTP/2.

@CMCDragonkai
Member

That seems to mean that the server can send its SETTINGS frame before the client "sends additional frames". But I'm not sure whether that means it's wrong for the server to send its SETTINGS frame before the client does, or whether it just means the server can send its SETTINGS frame only after the client has sent its preface, regardless of HEADERS and DATA frames. It's also possible that the server-sent SETTINGS frame itself is corrupted; we need to compare it with the happy case.
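
One way to check the first part (whether a Node HTTP/2 server sends its SETTINGS frame before the client has sent anything) without Wireshark is a tiny experiment like the following; the server and ports here are throwaway, not our GRPC setup:

import * as http2 from 'http2';
import * as net from 'net';

// Connect a plain TCP socket to an HTTP/2 server and write nothing.
// If bytes arrive anyway, the server sent its SETTINGS frame without waiting
// for the client connection preface.
const server = http2.createServer();
server.listen(0, '127.0.0.1', () => {
  const { port } = server.address() as net.AddressInfo;
  const sock = net.connect(port, '127.0.0.1', () => {
    console.log('connected, sending no client preface');
  });
  sock.on('data', (d) => {
    console.log(`server sent ${d.length} bytes before any client preface`);
    sock.destroy();
    server.close();
  });
});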

Furthermore, if this is occurring inside the Node.js HTTP2 core, that would be here: https://github.com/nodejs/node/blob/v14.17.3/lib/internal/http2/core.js#L755-L760

@CMCDragonkai
Member

And then:

// When an error occurs internally at the binding level, immediately
// destroy the session.
function onSessionInternalError(code) {
  if (this[kOwner] !== undefined)
    this[kOwner].destroy(new NghttpError(code));
}

This is passed into the C++ bindings later at:

https://github.com/nodejs/node/blob/9fb7b48c5ed7707fbc4007a2e3fc99aec7587d8b/lib/internal/http2/core.js#L3294-L3306

binding.setCallbackFunctions(
  onSessionInternalError,
  onPriority,
  onSettings,
  onPing,
  onSessionHeaders,
  onFrameError,
  onGoawayData,
  onAltSvc,
  onOrigin,
  onStreamTrailers,
  onStreamClose
);

@CMCDragonkai
Member

CMCDragonkai commented Apr 6, 2022

Could this be the bug? nodejs/node#35475

The bug doesn't reproduce anymore in our node version v14.17.3.

@CMCDragonkai
Member

Can we use https://nodejs.org/api/http2.html#event-remotesettings to see if that event gets emitted for the Http2Session inside the grpc-js library? This will require some monkey-patching.
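
A minimal sketch of what that listener looks like on a raw Http2Session (the grpc-js session itself isn't exposed, so in practice this has to be monkey-patched in where the session is created; the URL here is a placeholder):

import * as http2 from 'http2';

// Attach a 'remoteSettings' listener so we can tell whether the server's
// SETTINGS frame is ever actually received by the client session.
const session = http2.connect('http://127.0.0.1:55555');
session.on('remoteSettings', (settings) => {
  console.log('received remote SETTINGS', settings);
});
session.on('error', (err) => {
  console.error('session error', err);
});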

@tegefaulkes
Contributor Author

tegefaulkes commented Apr 6, 2022

Using the env variables NODE_DEBUG=http2*,stream* and NODE_DEBUG_NATIVE=http2 (they get applied to the Node internal runtime), we see this in the logs:

Http2Session client (12021) Error 'Remote peer returned unexpected data while we expected SETTINGS frame.  Perhaps, peer does not support HTTP/2 properly.'
HTTP2 621866: Http2Session client: destroying
HTTP2 621866: Http2Session client: start closing/destroying Error [ERR_HTTP2_ERROR]: Protocol error
    at Http2Session.onSessionInternalError (internal/http2/core.js:759:26) {
  code: 'ERR_HTTP2_ERROR',
  errno: -505
}
HTTP2 621866: Http2Stream 1 [Http2Session client]: destroying stream

@CMCDragonkai
Member

CMCDragonkai commented Apr 6, 2022

Current theory:

  1. The GRPC server can and will send a SETTINGS frame upon the TCP handshake. It doesn't need to wait for the client to send the connection preface. We see this occurring when wiresharking between the proxy and the GRPC server. This may be happening due to how we set up the reverse proxy; that is, we may be setting up a connection between the proxy and the GRPC server before we have composed it with the client proxy.
  2. The GRPC client is expecting to see a SETTINGS frame but it doesn't see it. We see that the SETTINGS frame is in fact sent by the proxy to the GRPC client, but it's possible that the Node HTTP2 core implementation is not reading that frame data, is dropping it, or something else.

My reading of the HTTP2 spec is that it isn't entirely clear whether the server is allowed to send its SETTINGS frame to the client before receiving the client connection preface. But the implementation of the GRPC server appears to be doing this. And if that is the case, then perhaps the Node HTTP2 core implementation is buggy here in that it's missing the SETTINGS frame.

This is why we need to add a remoteSettings event handler to the Http2Session object and see if it is in fact receiving the SETTINGS frame. If that event is not emitted, then we can be sure that the SETTINGS frame is being dropped.

The next step after this might be to report it upstream, while a workaround would be to change how we do our reverse proxy connection setup: maybe don't create the TCP connection between the proxy and the GRPC server UNTIL we have composed a connection first. A further workaround might be to plug the reverse-side UTP stream until we have seen data coming from the forward side; this would provide a traffic stop for the GRPC server's SETTINGS frame and ensure that it is delivered only after the GRPC client has sent its connection preface.

@tegefaulkes
Contributor Author

So yes, in the working case the client receives a 'remoteSettings' event and in the failing case it does not. So we can confirm that the SETTINGS frame is what's causing the problem in the failing case.

@tegefaulkes
Contributor Author

I've implemented a solution. We can stop the proxy from sending the SETTINGS frame out of order by waiting for data to be received first.

On the forward proxy we can delay sending data to the client until we receive data from the client by using clientSocket.once('data', ...). I tried to get this to work with cork or pause but it doesn't seem to work the way I want here, so as an alternative I delay piping data from the tlsSocket to the clientSocket until we see some data coming from the client socket.

      // Forward client -> server immediately
      clientSocket.pipe(this.tlsSocket, { end: false });
      // Delay server -> client until the client has sent its first bytes,
      // so the server's SETTINGS frame cannot overtake the client's preface
      clientSocket.once('data', () => {
        this.tlsSocket.pipe(clientSocket, { end: false });
      });

This fixes the problematic test in the NAT testing.

tegefaulkes added a commit that referenced this issue Apr 6, 2022
There was an issue in obscure conditions that caused certain packets to be out of the expected order for HTTP2, leading to a protocol error. This ensures that the `MAGIC` frame comes before the server's `SETTINGS` frame.

#369
tegefaulkes added a commit that referenced this issue Apr 6, 2022
There was an issue in obscure conditions that caused certain packets to be out of the expected order for HTTP2, leading to a protocol error. This ensures that the `MAGIC` frame comes before the server's `SETTINGS` frame.

#369
tegefaulkes added a commit that referenced this issue Apr 6, 2022
tegefaulkes added a commit that referenced this issue Apr 6, 2022
tegefaulkes added a commit that referenced this issue Apr 7, 2022
There was an issue in obscure conditions that caused certain packets to be out of the expected order for HTTP2, leading to a protocol error. This ensures that the `MAGIC` frame comes before the server's `SETTINGS` frame.

Updated `NodeConnectionManager` lifecycle logging.

`Proxy` and `NodeConnection` rejects connections to wildcard `0.0.0.0` and `::` addresses. Fixed up any tests broken by this change.

Added a test to check if you can connect to the target through the proxies after the proxies have created the UTP connection.

Fixes #369
tegefaulkes added a commit that referenced this issue Apr 7, 2022
There was an issue in obscure conditions that caused certain packets to be out of the expected order for HTTP2, leading to a protocol error. This ensures that the `MAGIC` frame comes before the server's `SETTINGS` frame.

Updated `NodeConnectionManager` lifecycle logging.

`Proxy` and `NodeConnection` rejects connections to wildcard `0.0.0.0` and `::` addresses. Fixed up any tests broken by this change.

Added a test to check if you can connect to the target through the proxies after the proxies have created the UTP connection.

Fixes #369
@teebirdy teebirdy added the r&d:polykey:core activity 3 Peer to Peer Federated Hierarchy label Jul 24, 2022