Crash after 401 error #3503

Closed
duncangroenewald opened this issue Jan 14, 2021 · 2 comments · Fixed by #3505
duncangroenewald commented Jan 14, 2021

Goals

We are trying to load data from an existing v10 local realm file into a synced MongoDB Realm using a JavaScript script. The same script was used successfully to load data into a Realm Cloud synced realm; some modifications have been made for MongoDB Realm.

Expected Results

The data should load into the synced realm and sync in the same manner as with Realm Cloud.

Actual Results

The script runs and successfully populates the synced realm's local realm file; subsequent queries of the local synced realm return the correct number of records for each object type (table).

However, the sync fails to complete, with what appear to be different errors each time the script is run.

Steps to Reproduce

  1. Create a MongoDB Realm app and set up Sync with the Atlas cluster as per the usual instructions. In our case we use email/password authentication.
  2. Set the Realm App to run in Development mode so that the schema can be imported from the client application.
  3. Set the appId and email/password in the client script and set the source realm filename.
  4. Run the script

When run, the script appears to finish copying all the data to the synced realm's local file, but the sync process fails and disconnects from the server.

Subsequent attempts to restart the script in query mode (i.e. don't copy data, just query the synced realm data) result in the call to open the realm never returning. Similarly, attempting to connect to the same Realm App from another client also fails to return from the call to open the realm.

In an attempt to fix this we added a delay between each write transaction. With the delay in place the script completes copying the data to the synced realm's local file, and the background sync process then starts syncing data but fails with a "Bad sync process (7)" error.
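The batch-and-pause strategy described above can be sketched generically (a sketch only: `writeBatch` is a hypothetical stand-in for one Realm write transaction, and `batchSize`/`pauseMs` mirror the script's `batchSize` and `waitInterval`):

```javascript
// Generic sketch of the batching-plus-pause strategy: one write transaction
// per batch keeps changesets small, and the pause between batches gives the
// sync worker a chance to upload before the next changeset is produced.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function copyInBatches(items, batchSize, pauseMs, writeBatch) {
    for (let i = 0; i < items.length; i += batchSize) {
        // `writeBatch` stands in for beginTransaction/create.../commitTransaction.
        await writeBatch(items.slice(i, i + batchSize));
        // Pause between batches (but not after the last one).
        if (pauseMs > 0 && i + batchSize < items.length) {
            await sleep(pauseMs);
        }
    }
}
```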

Restarting the script in query mode (i.e. the script does not load data from the source file but simply queries the synced realm data) seems to resume the sync process, which again fails after a period of time with the same error as above.

After a number of restarts the script eventually completed syncing the data (uploaded all the changesets).

However, this does not work reliably.

Below is an example of the failure from the client log:

SYNC: [2] Connection[1]: Session[1]: Upload compaction: original size = 136678, compacted size = 136678
SYNC: [1] Using already open Realm file: /Users/duncangroenewald/Development/RealmMigrationMongoDB/mongodb-realm/makespace-development-qkudm/5fffc37714c08f5936e29aef/s_default.realm
SYNC: [2] Connection[1]: Session[1]: Progress handler called, downloaded = 4980, downloadable(total) = 4980, uploaded = 53312310, uploadable = 55405067, reliable_download_progress = 1, snapshot version = 606
SYNC: [1] Using already open Realm file: /Users/duncangroenewald/Development/RealmMigrationMongoDB/mongodb-realm/makespace-development-qkudm/5fffc37714c08f5936e29aef/s_default.realm
UPLOAD: [53312310] 55405067
UPLOAD: [53312310] 55405067
SYNC: [2] Connection[1]: Session[1]: Progress handler called, downloaded = 4980, downloadable(total) = 4980, uploaded = 53312310, uploadable = 55405067, reliable_download_progress = 1, snapshot version = 606
SYNC: [1] Connection[1]: Download message compression: is_body_compressed = 0, compressed_body_size=0, uncompressed_body_size=0
SYNC: [2] Connection[1]: Session[1]: Received: DOWNLOAD(download_server_version=295, download_client_version=584, latest_server_version=295, latest_server_version_salt=3230038699550877483, upload_client_version=584, upload_server_version=287, downloadable_bytes=0, num_changesets=0, ...)
SYNC: [1] Using already open Realm file: /Users/duncangroenewald/Development/RealmMigrationMongoDB/mongodb-realm/makespace-development-qkudm/5fffc37714c08f5936e29aef/s_default.realm
SYNC: [1] Using already open Realm file: /Users/duncangroenewald/Development/RealmMigrationMongoDB/mongodb-realm/makespace-development-qkudm/5fffc37714c08f5936e29aef/s_default.realm
SYNC: [2] Connection[1]: Session[1]: Sending: UPLOAD(progress_client_version=607, progress_server_version=295, locked_server_version=295, num_changesets=0)
SYNC: [1] Using already open Realm file: /Users/duncangroenewald/Development/RealmMigrationMongoDB/mongodb-realm/makespace-development-qkudm/5fffc37714c08f5936e29aef/s_default.realm
UPLOAD: [53360289] 55405067
SYNC: [2] Connection[1]: Session[1]: Progress handler called, downloaded = 4980, downloadable(total) = 4980, uploaded = 53360289, uploadable = 55405067, reliable_download_progress = 1, snapshot version = 607
SYNC: [1] Using already open Realm file: /Users/duncangroenewald/Development/RealmMigrationMongoDB/mongodb-realm/makespace-development-qkudm/5fffc37714c08f5936e29aef/s_default.realm
SYNC: [2] Connection[1]: Session[1]: Sending: UPLOAD(progress_client_version=608, progress_server_version=295, locked_server_version=295, num_changesets=1)
SYNC: [1] Connection[1]: Session[1]: Fetching changeset for upload (client_version=608, server_version=295, changeset_size=136309, origin_timestamp=190594238985, origin_file_ident=0)
SYNC: [1] Connection[1]: Session[1]: Changeset: 3F 00 0E 41 73 73 6F 72 74 6D 65 6E 74 49 74 65 6D 3F 01 0F 41 73 73 6F 72 74 6D 65 6E 74 53 68 65 6C 66 3F 02 24 45 43 43 39 39 35 37 38 2D 45 41 45 44 2D 34 46 36 32 2D 42 30 44 31 2D 35 37 43 44 37 35 45 44 38 43 35 35 3F 03 24 37 43 43 36 46 38 42 46 2D 36 30 41 46 2D 34 35 39 39 2D 38 41 36 44 2D 33 33 34 39 43 30 43 32 31 39 44 34 3F 04 05 73 68 65 6C 66 04 00 03 03 04 00 09 01 03 02 00 3F 05 07 50 72 6F 64 75 63 74 3F 06 24 37 43 31 32 37 36 45 31 2D 43 42 35 34 2D 34 46 32 43 2D 38 30 32 43 2D 43 35 42 32 45 42 44 37 34 39 45 46 3F 07 07 70 72 6F 64 75 63 74 04 00 03 03 07 00 09 05 03 06 00 3F 08 24 33 31 35 35 37 33 38 39 2D 31 38 38 41 2D 34 38 38 33 2D 39 30 35 44 2D 34 31 35 31 30 36 42 35 45 30 43 41 3F 09 24 44 37 36 37 41 31 34 46 2D 42 32 39 43 2D 34 33 38 35 2D 42 32 44 35 2D 45 33 46 44 38 30 38 43 44 43 38 42 04 00 03 09 04 00 09 01 03 08 00 3F 0A 24 41 31 33 37 34 46 46 43 2D 44 33 39 42 2D 34 44 46 34 2D 42 45 34 38 2D 31 39 30 46 46 30 37 44 43 31 44 45 04 00 03 09 07 00 09 05 03 0A 00 3F 0B 24 30 46 42 38 34 39 38 41 2D 38 42 37 42 2D 34 38 42 30 2D 41 35 38 33 2D 33 43 43 38 42 44 30 30 35 43 45 38 3F 0C 24 37 44 33 43 37 37 43 44 2D 46 45 38 33 2D 34 38 38 36 2D 38 42 44 37 2D 31 37 37 36 42 45 34 30 45 37 37 41 04 00 03 0C 04 00 09 01 03 0B 00 3F 0D 24 38 34 31 31 36 31 42 43 2D 45 37 37 30 2D 34 32 41 35 2D 39 43 43 37 2D 38 43 38 35 44 44 39 39 42 37 36 32 04 00 03 0C 07 00 09 05 03 0D 00 3F 0E 24 31 38 44 37 46 37 31 41 2D 45 43 35 32 2D 34 34 44 31 2D 42 45 46 42 2D 42 42 43 39 46 45 36 35 34 30 42 44 3F 0F 24 39 32 42 44 34 45 44 38 2D 43 37 42 38 2D 34 36 45 44 2D 38 45 31 32 2D 46 31 42 44 44 34 34 44 41 46 30 30 04 00 03 0F 04 00 09 01 03 0E 00 3F 10 24 37 35 43 33 44 46 45 35 2D 31 43 39 36 2D 34 37 31 42 2D 42 32 36 35 2D 37 39 41 41 32 43 37 46 44 36 44 39 04 00 03 0F 07 00 09 05 03 10 00 3F 11 24 42 45 42 32 35 31 38 44 2D 44 33 45 30 2D 34 33 46 31 2D 38 44 37 35 2D 34 45 38 31 
32 43 44 37 44 31 32 37 3F 12 24 32 45 46 36 43 43 45 30 2D 36 32 37 31 2D 34 32 38 37 2D 38 32 33 42 2D 35 42 39 41 42 43 45 36 32 32 36 32 04 00 03 12 04 00 09 01 03 11 00 3F 13 24 36 44 43 46 30 34 45 42 2D 45 37 36 34 2D 34 35 31 46 2D 42 41 31 31 2D 36 34 35 35 41 37 39 35 34 32 43 39 04 00 03 12 07 00 09 05 03 13 00 3F 14 24 39 38 43 37 43 32 39 38 2D 36 37 33 45 2D 34 37 45 38 2D 38 37 30 46 2D 41 45 32 31 34 34 32 41 30 30 30 35 3F 15 24 30 44 41 46 32 34 46 41 2D 41 45 43 37 2D 34 44 30 32 2D 42 43 46 43 2D 38 31 38 43 31 43 37 43 39 46 39 39 04 00 03 15 04 00 09 01 03 14 00 3F 16 24 31 30 36 43 38 30 42 35 2D 34 33 34 39 2D 34 42 32 39 2D 38 45 45 39 2D 44 31 41 35 36 39 45 43 35 41 44 43 04 00 03 15 07 00 09 05 03 16 00 3F 17 24 36 43 46 36 38 39 46 37 2D 43 39 45 45 2D 34 32 36 32 2D 38 38 33 37 2D 32 41 34 44 46 39 33 30 44 44 35 37 3F 18 24 31 42 39 37 34 32 35 35 2D 43 38 32 38 2D 34 43 46 31 2D 42 33 39 43 2D 44 44 34 44 42 42 33 36 46 38 33 43 04 00 03 18 04 00 09 01 03 17 00 3F 19 24 33 39 36 36 34 41 31 35 2D 41 33 35 30 2D 34 32 46 31 2D 39 45 30 38 2D 37 42 45 31 30 38 34 39 42 43 38...
SYNC: [2] Connection[1]: Session[1]: Upload compaction: original size = 136309, compacted size = 136309
SYNC: [1] Using already open Realm file: /Users/duncangroenewald/Development/RealmMigrationMongoDB/mongodb-realm/makespace-development-qkudm/5fffc37714c08f5936e29aef/s_default.realm
SYNC: [2] Connection[1]: Session[1]: Progress handler called, downloaded = 4980, downloadable(total) = 4980, uploaded = 53360289, uploadable = 55541376, reliable_download_progress = 1, snapshot version = 608
SYNC: [1] Using already open Realm file: /Users/duncangroenewald/Development/RealmMigrationMongoDB/mongodb-realm/makespace-development-qkudm/5fffc37714c08f5936e29aef/s_default.realm
SYNC: [2] Connection[1]: Session[1]: Progress handler called, downloaded = 4980, downloadable(total) = 4980, uploaded = 53360289, uploadable = 55541376, reliable_download_progress = 1, snapshot version = 608
UPLOAD: [53360289] 55541376
UPLOAD: [53360289] 55541376
SYNC: [1] Connection[1]: Download message compression: is_body_compressed = 0, compressed_body_size=0, uncompressed_body_size=0
SYNC: [2] Connection[1]: Session[1]: Received: DOWNLOAD(download_server_version=296, download_client_version=585, latest_server_version=296, latest_server_version_salt=4348031219651517719, upload_client_version=586, upload_server_version=288, downloadable_bytes=0, num_changesets=0, ...)
SYNC: [1] Using already open Realm file: /Users/duncangroenewald/Development/RealmMigrationMongoDB/mongodb-realm/makespace-development-qkudm/5fffc37714c08f5936e29aef/s_default.realm
SYNC: [1] Using already open Realm file: /Users/duncangroenewald/Development/RealmMigrationMongoDB/mongodb-realm/makespace-development-qkudm/5fffc37714c08f5936e29aef/s_default.realm
SYNC: [2] Connection[1]: Session[1]: Sending: UPLOAD(progress_client_version=609, progress_server_version=296, locked_server_version=296, num_changesets=0)
SYNC: [1] Using already open Realm file: /Users/duncangroenewald/Development/RealmMigrationMongoDB/mongodb-realm/makespace-development-qkudm/5fffc37714c08f5936e29aef/s_default.realm
UPLOAD: [53497539] 55541376
SYNC: [2] Connection[1]: Session[1]: Progress handler called, downloaded = 4980, downloadable(total) = 4980, uploaded = 53497539, uploadable = 55541376, reliable_download_progress = 1, snapshot version = 609
SYNC: [1] Using already open Realm file: /Users/duncangroenewald/Development/RealmMigrationMongoDB/mongodb-realm/makespace-development-qkudm/5fffc37714c08f5936e29aef/s_default.realm
SYNC: [2] Connection[1]: Session[1]: Sending: UPLOAD(progress_client_version=610, progress_server_version=296, locked_server_version=296, num_changesets=1)
SYNC: [1] Connection[1]: Session[1]: Fetching changeset for upload (client_version=610, server_version=296, changeset_size=134880, origin_timestamp=190594249151, origin_file_ident=0)
SYNC: [1] Connection[1]: Session[1]: Changeset: 3F 00 0E 41 73 73 6F 72 74 6D 65 6E 74 49 74 65 6D 3F 01 0F 41 73 73 6F 72 74 6D 65 6E 74 53 68 65 6C 66 3F 02 24 39 44 36 32 46 30 45 33 2D 37 44 38 43 2D 34 32 39 34 2D 38 43 31 38 2D 32 38 35 32 46 42 32 34 31 31 42 35 3F 03 24 32 33 43 39 37 36 30 39 2D 36 43 41 43 2D 34 30 46 31 2D 41 35 35 34 2D 36 36 39 33 39 33 37 42 32 44 46 30 3F 04 05 73 68 65 6C 66 04 00 03 03 04 00 09 01 03 02 00 3F 05 07 50 72 6F 64 75 63 74 3F 06 24 35 44 46 43 38 36 42 44 2D 33 38 36 46 2D 34 36 39 31 2D 42 31 31 36 2D 38 34 33 35 43 32 33 39 41 46 31 34 3F 07 07 70 72 6F 64 75 63 74 04 00 03 03 07 00 09 05 03 06 00 3F 08 24 34 30 38 37 33 34 39 39 2D 32 31 45 38 2D 34 33 39 33 2D 41 43 46 38 2D 43 44 37 33 42 38 44 36 45 45 38 32 3F 09 24 35 39 42 38 43 42 35 32 2D 37 39 39 39 2D 34 42 46 39 2D 42 44 34 45 2D 36 42 42 37 34 44 44 37 30 38 33 37 04 00 03 09 04 00 09 01 03 08 00 3F 0A 24 30 39 38 46 37 43 46 44 2D 43 35 38 38 2D 34 42 32 31 2D 39 46 34 30 2D 41 34 41 39 46 44 43 42 39 37 35 37 04 00 03 09 07 00 09 05 03 0A 00 3F 0B 24 31 36 39 30 30 43 38 43 2D 33 44 32 37 2D 34 32 42 35 2D 38 34 45 43 2D 34 35 42 38 42 42 38 34 43 32 32 34 3F 0C 24 36 32 46 37 31 37 37 44 2D 39 39 42 37 2D 34 44 43 37 2D 41 36 42 39 2D 46 36 41 34 30 43 34 37 38 33 32 46 04 00 03 0C 04 00 09 01 03 0B 00 3F 0D 24 43 46 34 38 43 39 31 35 2D 34 38 46 36 2D 34 39 46 38 2D 39 42 39 33 2D 32 30 37 33 34 38 46 42 41 31 38 35 04 00 03 0C 07 00 09 05 03 0D 00 3F 0E 24 37 45 37 38 31 41 38 39 2D 42 34 41 39 2D 34 46 38 32 2D 39 32 31 44 2D 44 30 36 46 38 44 32 43 32 39 39 33 3F 0F 24 43 39 41 45 35 33 33 32 2D 39 42 36 30 2D 34 41 39 46 2D 39 33 42 45 2D 33 34 38 33 35 36 44 41 32 43 38 46 04 00 03 0F 04 00 09 01 03 0E 00 3F 10 24 42 41 44 34 30 42 37 32 2D 36 31 41 44 2D 34 46 44 30 2D 41 30 39 36 2D 32 33 44 44 46 35 39 43 45 34 32 46 04 00 03 0F 07 00 09 05 03 10 00 3F 11 08 70 72 6F 64 75 63 74 73 3F 12 24 32 44 30 35 35 37 42 34 2D 44 42 34 36 2D 34 38 39 
45 2D 41 30 43 36 2D 41 32 39 43 37 37 42 44 43 43 45 38 08 00 03 0F 11 01 00 09 05 03 12 00 3F 13 24 35 45 42 39 46 46 34 38 2D 34 42 34 37 2D 34 43 39 33 2D 42 42 39 41 2D 42 39 31 41 38 31 32 34 42 45 41 46 08 00 03 0F 11 01 01 09 05 03 13 01 3F 14 24 33 35 31 45 31 35 38 30 2D 46 38 37 37 2D 34 32 41 33 2D 42 39 30 41 2D 30 36 43 37 45 34 45 35 32 37 38 39 08 00 03 0F 11 01 02 09 05 03 14 02 3F 15 24 32 30 36 36 31 38 38 35 2D 39 37 33 32 2D 34 39 41 41 2D 42 36 31 41 2D 33 35 43 39 46 46 46 42 43 35 46 45 3F 16 24 35 35 45 31 31 39 46 44 2D 30 43 31 32 2D 34 37 37 35 2D 38 41 34 39 2D 38 38 37 36 30 43 31 39 36 34 37 38 04 00 03 16 04 00 09 01 03 15 00 3F 17 24 45 45 32 41 41 36 42 30 2D 37 44 33 43 2D 34 37 41 38 2D 38 35 43 31 2D 35 34 34 42 33 42 35 46 43 31 34 34 04 00 03 16 07 00 09 05 03 17 00 3F 18 24 45 35 30 42 41 38 30 31 2D 46 37 34 36 2D 34 46 35 31 2D 39 36 44 33 2D 31 41 30 37 37 45 39 45 43 44 38 43 3F 19 24 36 31 35 37 38 39 42 44 2D 33 31 35 41 2D 34 30 34 45 2D 39 31 34 45 2D 46 42 42 44 41 45 45 42 34 33 30 46 04 00 03 19 04 00 09 01 03 18 00 3F 1A 24 33 33 32 33 39 32 34 37 2D 32...
SYNC: [2] Connection[1]: Session[1]: Upload compaction: original size = 134880, compacted size = 134880
SYNC: [1] Using already open Realm file: /Users/duncangroenewald/Development/RealmMigrationMongoDB/mongodb-realm/makespace-development-qkudm/5fffc37714c08f5936e29aef/s_default.realm
SYNC: [2] Connection[1]: Session[1]: Progress handler called, downloaded = 4980, downloadable(total) = 4980, uploaded = 53497539, uploadable = 55676256, reliable_download_progress = 1, snapshot version = 610
SYNC: [1] Using already open Realm file: /Users/duncangroenewald/Development/RealmMigrationMongoDB/mongodb-realm/makespace-development-qkudm/5fffc37714c08f5936e29aef/s_default.realm
UPLOAD: [53497539] 55676256
UPLOAD: [53497539] 55676256
SYNC: [2] Connection[1]: Session[1]: Progress handler called, downloaded = 4980, downloadable(total) = 4980, uploaded = 53497539, uploadable = 55676256, reliable_download_progress = 1, snapshot version = 610
SYNC: [2] Connection[1]: Timeout on reception of PONG message
SYNC: [4] Connection[1]: Connection closed due to error
SYNC: [3] Connection[1]: Resolving 'ws.realm.mongodb.com:443'
SYNC: [3] Connection[1]: Connecting to endpoint '52.64.157.195:443' (1/1)
SYNC: [4] Connection[1]: Connected to endpoint '52.64.157.195:443' (from '10.0.1.171:52704')
SYNC: [2] Connection[1]: WebSocket::initiate_client_handshake()
SYNC: [1] Connection[1]: HTTP request =
GET /api/client/v2.0/app/makespace-development-qkudm/realm-sync HTTP/1.1
Host: ws.realm.mongodb.com
Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJiYWFzX2RldmljZV9pZCI6IjYwMDBiZTRlNDE2ODQxMTdiOTMxY2M3NCIsImJhYXNfZG9tYWluX2lkIjoiNWZmZmMyZjA2YzEwMjRmZmU3ZDIzYjYwIiwiZXhwIjoxNjEwNjYzMjU0LCJpYXQiOjE2MTA2NjE0NTQsImlzcyI6IjYwMDBiZTRlNDE2ODQxMTdiOTMxY2M3YiIsInN0aXRjaF9kZXZJZCI6IjYwMDBiZTRlNDE2ODQxMTdiOTMxY2M3NCIsInN0aXRjaF9kb21haW5JZCI6IjVmZmZjMmYwNmMxMDI0ZmZlN2QyM2I2MCIsInN1YiI6IjVmZmZjMzc3MTRjMDhmNTkzNmUyOWFlZiIsInR5cCI6ImFjY2VzcyJ9.kaa5w9KLthss_gpjbXRnW43wL_UOPEDJJvNzsALWBzk
Connection: Upgrade
Sec-WebSocket-Key: Io0TJ8lmCFnJap9VUxHxew==
Sec-WebSocket-Protocol: com.mongodb.realm-sync/2
Sec-WebSocket-Version: 13
Upgrade: websocket
User-Agent: RealmSync/10.1.5 (macOS Darwin 20.2.0 Darwin Kernel Version 20.2.0: Wed Dec  2 20:40:21 PST 2020; root:xnu-7195.60.75~1/RELEASE_ARM64_T8101 x86_64) RealmJS/10.1.2 (node.js, darwin, vv12.20.0)


SYNC: [2] Connection[1]: WebSocket::handle_http_response_received()
SYNC: [1] Connection[1]: HTTP response = HTTP/1.1 401 Unauthorized
cache-control: no-cache, no-store, must-revalidate
connection: close
content-length: 190
content-type: application/json
date: Thu, 14 Jan 2021 22:50:55 GMT
server: envoy
vary: Origin
x-envoy-max-retries: 0
x-frame-options: DENY


SYNC: [6] Connection[1]: Websocket: Expected HTTP response 101 Switching Protocols, but received:
HTTP/1.1 401 Unauthorized
cache-control: no-cache, no-store, must-revalidate
connection: close
content-length: 190
content-type: application/json
date: Thu, 14 Jan 2021 22:50:55 GMT
server: envoy
vary: Origin
x-envoy-max-retries: 0
x-frame-options: DENY


SYNC: [4] Connection[1]: Connection closed due to error

#
# Fatal error in HandleScope::HandleScope
# Entering the V8 API without proper locking in place
#

zsh: illegal hardware instruction  node --max-old-space-size=2048 MigrateLocalToSync.js
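For what it's worth, the 401 in the log may indicate that the access token had expired by the time the client attempted the WebSocket re-handshake. Independent of the SDK crash itself, a script-level retry with backoff around the login/open step is a generic guard (a sketch only; `connect` is a hypothetical async operation such as a fresh `logIn` plus `Realm.open`):

```javascript
// Generic retry-with-exponential-backoff wrapper. This is NOT the SDK fix
// (the crash was addressed in realm-js itself); it only guards a script
// against transient auth/connection failures. `connect` is hypothetical.
async function withRetry(connect, { attempts = 5, baseDelayMs = 1000 } = {}) {
    let lastError;
    for (let attempt = 0; attempt < attempts; attempt++) {
        try {
            return await connect();
        } catch (error) {
            lastError = error;
            // Back off between attempts: 1s, 2s, 4s, ... (but not after the last).
            if (attempt + 1 < attempts) {
                const delay = baseDelayMs * 2 ** attempt;
                await new Promise((resolve) => setTimeout(resolve, delay));
            }
        }
    }
    throw lastError;
}
```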

Code Sample

// Copy local realm to MongoDB Realm synced realm
// usage: node MigrateLocalToSync.js

// Required Prework
// 1. Create the MongoDB Realm application and get the AppID
// 2. Create the realm user (use email/password authentication)
// 3. Copy the realm file (with '_id' primary keys) to the local folder
// 4. Set the correct appId and username and password for MongoDB Realm
// 5. Run this script
//
// If the default realm exists then delete it and create a new one.  We have not tested
// deleting all existing objects rather than creating a new default realm.
// Use Realm Studio to create the Realm Cloud default realm
//const Buffer = require('buffer').Buffer;

// Set higher memory limit
// node --max-old-space-size=4096 index.js #increase to 4gb

// UPDATE THESE
const appId = "xxxxx";
const username = 'xxxxx';              // this is the user doing the copy
const password = 'xxxxx';
const source_realm_path = './Source.realm';    // path on disk
const partitionKey = 'default';           // the partition key (_partition)

const waitInterval = 10;  // seconds
const batchSize = 1000;
var syncSession;


const Realm = require('realm');
const io = require('console-read-write');
const { exists } = require('realm');

const systemTables = ['BindingObject', '__Class', '__Permission', '__Realm', '__Role', '__User'];


function sleep(ms) {
    return new Promise(resolve => setTimeout(resolve, ms));
}

// Copy the object properties
// If it is an object then set the property to null - we update this in second cycle
// If it is a list then set the list to empty - we update these in second cycle 
var copyObject = function (obj, objSchema, targetRealm) {
    const copy = {};
    for (var key in objSchema.properties) {
        const prop = objSchema.properties[key];
        if (!prop.hasOwnProperty('objectType')) {
            copy[key] = obj[key];
        }
        else if (prop['type'] == "list") {
            copy[key] = [];
        }
        else {
            copy[key] = null;
        }
    }

    try {
        // Add this object to the target realm
        targetRealm.create(objSchema.name, copy);
    } catch (error) {
        console.log("error creating object: ", error);
    }

}

var getMatchingObjectInOtherRealm = function (sourceObj, source_realm, target_realm, class_name) {
    const allObjects = source_realm.objects(class_name);
    const ndx = allObjects.indexOf(sourceObj);

    // Get object on same position in target realm
    return target_realm.objects(class_name)[ndx];
}

var addLinksToObject = function (sourceObj, targetObj, objSchema, source_realm, target_realm) {
    for (var key in objSchema.properties) {
        const prop = objSchema.properties[key];
        if (prop.hasOwnProperty('objectType')) {
            if (prop['type'] == "list") {
                var targetList = targetObj[key];
                sourceObj[key].forEach((linkedObj) => {
                    const obj = getMatchingObjectInOtherRealm(linkedObj, source_realm, target_realm, prop.objectType);
                    if (obj) {
                        targetList.push(obj);
                    }

                });
            }
            else {
                // Find the position of the linked object
                const linkedObj = sourceObj[key];
                if (linkedObj === null) {
                    continue;
                }

                // Set link to object on same position in target realm
                targetObj[key] = getMatchingObjectInOtherRealm(linkedObj, source_realm, target_realm, prop.objectType);
            }
        }
    }
}
var querySourceRealm = async function () {
    console.log("querySourceRealm()")

    const source_realm = new Realm({ path: source_realm_path });

    const realm_schema = source_realm.schema;

    for (const objSchema of realm_schema) {
        const allObjects = source_realm.objects(objSchema['name']);

        console.log(objSchema['name'] + ": " + allObjects.length)

    }

    console.log("Done.");
    console.log("Enter to quit");
    await io.read();
    console.log("Ended");

}
var queryRealm = async function (app) {
    console.log("queryRealm()")


    const target_realm = await Realm.open({
        sync: {
            user: app.currentUser,
            partitionValue: partitionKey,
        },
    });
    console.log("realm opened.")

    target_realm.syncSession.addProgressNotification("upload", "reportIndefinitely", (transferred, transferable) => console.log("UPLOAD: [" + transferred + "] " + transferable));
    target_realm.syncSession.addProgressNotification("download", "reportIndefinitely", (transferred, transferable) => console.log("DOWNLOAD: [" + transferred + "] " + transferable));


    const realm_schema = target_realm.schema;

    for (const objSchema of realm_schema) {
        const allObjects = target_realm.objects(objSchema['name']);

        console.log(objSchema['name'] + ": " + allObjects.length)

    }

    console.log("Done");
    console.log("Enter to quit");
    await io.read();
    console.log("Ended");

}
var copyRealm = async function (app, local_realm_path) {
    console.log("copyRealm()")
    // Open the local realm
    const source_realm = new Realm({ path: local_realm_path });
    const source_realm_schema = source_realm.schema;

    // Create the new realm (with same schema as the source)

    // Deep-copy the schema and drop the system tables (e.g. '__Class').
    // Filtering by name avoids splice(-1, 1) accidentally removing the last
    // element when a table is absent from the schema.
    var target_realm_schema = JSON.parse(JSON.stringify(source_realm_schema))
        .filter(v => !systemTables.includes(v.name));

    const target_realm = await Realm.open({
        schema: target_realm_schema,
        sync: {
            user: app.currentUser,
            partitionValue: partitionKey,
        },
    });
    console.log("realm opened.")

    target_realm.syncSession.addProgressNotification("upload", "reportIndefinitely", (transferred, transferable) => console.log("UPLOAD: [" + transferred + "] " + transferable));
    //target_realm.syncSession.addProgressNotification("download", "reportIndefinitely", (transferred, transferable) => console.log("DOWNLOAD: [" + transferred + "] " + transferable));


    // Copy all objects but ignore links for now
    for (const objSchema of source_realm_schema) {

        // Ignore the system tables (e.g. '__Class')
        if (!systemTables.includes(objSchema['name'])) {

            console.log("copying objects:", objSchema['name']);
            const allObjects = source_realm.objects(objSchema['name']);

            var index = 0;

            for (const obj of allObjects) {

                if (index == 0) {
                    target_realm.beginTransaction();
                }

                copyObject(obj, objSchema, target_realm)

                if (index == batchSize) {
                    target_realm.commitTransaction();
                    index = 0;
                    // Pause here trying to get Sync to break up data into small packets
                    if (waitInterval > 0) {
                        await sleep(waitInterval * 1000);
                    }
                } else {
                    index += 1;
                }

            };

            if (index > 0) {
                target_realm.commitTransaction();
                index = 0;
            }

            // Pause here trying to get Sync to break up data into small packets
            if (waitInterval > 0) {
                await sleep(waitInterval * 1000);
            }

        } else {
            console.log("skipping objects:", objSchema['name']);
        }
    };
    console.log("Object copy completed");

    if (waitInterval > 0) {
        console.log("Pausing...");
        await sleep(waitInterval * 1000);
        await sleep(waitInterval * 1000);
        await sleep(waitInterval * 1000);
        console.log(" continuing...");
    }

    // Do a second pass to add links
    for (const objSchema of source_realm_schema) {

        // Ignore the system tables (e.g. '__Class')
        if (!systemTables.includes(objSchema['name'])) {
            console.log("updating links in:", objSchema['name']);
            const allSourceObjects = source_realm.objects(objSchema['name']);
            const allTargetObjects = target_realm.objects(objSchema['name']);

            var index2 = 0;

            for (var i = 0; i < allSourceObjects.length; ++i) {
                const sourceObject = allSourceObjects[i];
                const targetObject = allTargetObjects[i];

                if (index2 == 0) {
                    target_realm.beginTransaction();
                }

                addLinksToObject(sourceObject, targetObject, objSchema, source_realm, target_realm);

                if (index2 == batchSize) {
                    target_realm.commitTransaction();
                    index2 = 0;
                    // Pause here trying to get Sync to break up data into small packets
                    if (waitInterval > 0) {
                        await sleep(waitInterval * 1000);
                    }
                } else {
                    index2 += 1;
                }
            }

            if (index2 > 0) {
                target_realm.commitTransaction();
                index2 = 0;
            }

            // Pause here trying to get Sync to break up data into small packets
            if (waitInterval > 0) {
                await sleep(waitInterval * 1000);
            }
        }


    };

    console.log("Links update completed");


}


async function run() {

    try {

        const app = new Realm.App({ id: appId });
        Realm.App.Sync.setLogLevel(app, "all");  // "all", "debug", "error", "info"
        Realm.App.Sync.setLogger(app, (level, message) => console.log("SYNC: [" + level + "] " + message));

        const credentials = Realm.Credentials.emailPassword(username, password); // static factory - no `new` needed

        await app.logIn(credentials);

        await copyRealm(app, source_realm_path);

        // Use this to query the new realm file
        // In theory delete the local folder called 'mongodb-realm' and run this and it should automatically download
        // data from the MongoDB Realm app instance in the cloud.

        //await queryRealm(app);

        //await querySourceRealm();

    } catch (error) {
        console.log("Error: ", error)
    }

    console.log("Enter to quit");
    io.write(await io.read());
    console.log("Waiting to finish...");

}



run().catch(err => {
    console.error("Failed to open realm:", err)
});
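A note on the script's link pass: `getMatchingObjectInOtherRealm` matches objects by position, which assumes both realms enumerate a class in the same order. Since these objects carry `_id` primary keys, resolving links by key is more robust. A minimal sketch, with plain objects standing in for Realm objects:

```javascript
// Sketch of resolving cross-realm links by primary key instead of by
// collection index. `_id` matches the primary key mentioned in the prework
// notes; plain objects stand in for Realm objects here.
function buildIdIndex(objects) {
    // One pass over the target collection gives O(1) lookups afterwards.
    const index = new Map();
    for (const obj of objects) {
        index.set(obj._id, obj);
    }
    return index;
}

function resolveLink(sourceLinkedObj, targetIndex) {
    // Look the linked object up in the target realm by its primary key.
    return targetIndex.get(sourceLinkedObj._id) ?? null;
}
```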

Version of Realm and Tooling

  • Realm JS SDK Version: 10.5.0
  • Node or React Native: 12.20.0
  • Client OS & Version: macOS 11
  • Which debugger for React Native: None
@RedBeard0531 changed the title from "Data load script" to "Crash after 401 error" on Jan 15, 2021
RedBeard0531 added a commit that referenced this issue Jan 15, 2021
The problem with the fix in #3340 was that it was causing the
EventLoopDispatcher to be constructed on the sync thread rather than on the JS
thread, which is required. This alternative fix initializes it on the JS
thread, but ensures it is re-initialized when RN apps are reloaded.
steffenagger (Contributor) commented:

Hello @duncangroenewald, thank you for a highly detailed report.

We believe this is related to an issue being fixed by @RedBeard0531 (as you can see from the mention above).
Please follow #3505 for progress; once merged, the fix should be in the following v10 release.

kneth added a commit that referenced this issue Jan 15, 2021
* Revert "recreate EventLoopDispatcher in NetworkTransport on hot reload in RN (#3340)"

This reverts commit 4c4e497.

* Alternative fix for #3236 that avoids causing #3503

The problem with the fix in #3340 was that it was causing the
EventLoopDispatcher to be constructed on the sync thread rather than on the JS
thread, which is required. This alternative fix initializes it on the JS
thread, but ensures it is re-initialized when RN apps are reloaded.

Co-authored-by: Kenneth Geisshirt <[email protected]>
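As a loose JavaScript analogy of the EventLoopDispatcher behaviour described in the commit message (conceptual only; the actual fix lives in the SDK's native layer): a dispatcher queues callbacks onto the event loop instead of invoking them synchronously from the originating context.

```javascript
// Conceptual analogy: wrap a callback so that, no matter where the wrapped
// function is called from, the callback itself always runs on a later turn
// of the event loop rather than synchronously inside the caller's stack.
function makeDispatcher(callback) {
    return (...args) => {
        // setImmediate defers the call to the event loop, mimicking how the
        // SDK marshals sync-thread events back to the JS thread.
        setImmediate(() => callback(...args));
    };
}
```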
duncangroenewald (Author) commented:

@steffenagger - I just tested with 10.1.3 and sync still fails. I will raise a new issue, since the failure mode seems a little different now.
