
WIP: stash state lib #2944

Draft · wants to merge 121 commits into base: main

Changes from all commits (121)
c399839
initial scaffold
alvrs Jul 15, 2024
04d1dda
add minimal benchmark setup for inserting into zustand
alvrs Jul 15, 2024
e06d45c
add recs benchmark for reference
alvrs Jul 15, 2024
a7ae895
add mutative benchmark
alvrs Jul 15, 2024
b2ece42
add benchmark for batch updates
alvrs Jul 16, 2024
8f052c4
drafting store API
alvrs Jul 17, 2024
ddd4b54
implement createStore
alvrs Jul 17, 2024
8caae95
iterate on api
alvrs Jul 17, 2024
94dd1ed
add attest setup and initial test case
alvrs Jul 17, 2024
807ff50
add tests for setRecord and getRecord
alvrs Jul 17, 2024
4b05fe0
add impl and test for registerTable, getTable and BoundTable
alvrs Jul 17, 2024
1d5d3a8
split into multiple files
alvrs Jul 17, 2024
ddfea89
Rename Record into TableRecord
alvrs Jul 17, 2024
c5e3397
typo
alvrs Jul 17, 2024
60d93e1
wip
alvrs Jul 18, 2024
6cbf8a7
first draft of runQuery
alvrs Jul 18, 2024
0badd04
add some tests for runQuery
alvrs Jul 18, 2024
a501f4a
add more tests for runQuery
alvrs Jul 18, 2024
bf707ad
add getInitialKeys to queryFragment type
alvrs Jul 18, 2024
9062655
ideas for more query fragments
alvrs Jul 18, 2024
c509799
wip
alvrs Jul 19, 2024
a487d52
add table subscriber implementation
alvrs Jul 19, 2024
30698be
add test for subscriptions
alvrs Jul 19, 2024
bb684cc
add test for unsubscribe
alvrs Jul 19, 2024
9a896ce
add defineQuery implementation
alvrs Jul 20, 2024
d847202
fix setRecord subscriber order
alvrs Jul 20, 2024
74e42cc
add deleteRecord and more tests
alvrs Jul 20, 2024
0eaef49
add getRecords
alvrs Jul 20, 2024
c310bdb
add option to return records to runQuery
alvrs Jul 20, 2024
c948d50
send update to initial subscribers
alvrs Jul 20, 2024
d9257fd
use types matching store config
alvrs Jul 22, 2024
084e026
accept mud config schema as input
alvrs Jul 22, 2024
d53a45e
add more types
alvrs Jul 22, 2024
f37d000
add setRecords
alvrs Jul 22, 2024
78e2746
add storage adapter
alvrs Jul 22, 2024
716b993
add storage adapter
alvrs Jul 23, 2024
d9b9583
add getTables
alvrs Jul 23, 2024
56f2e9c
add getTables test
alvrs Jul 23, 2024
4524d28
add syncToZustandQuery
alvrs Jul 23, 2024
734c508
don't require fields to be named the same for queries
alvrs Jul 23, 2024
aab867d
thinking
alvrs Jul 23, 2024
518bc38
feat: WIP recs shim
Kooshaba Jul 23, 2024
9d0aed1
add decoder for dozer response
alvrs Jul 24, 2024
528c1b3
rename dozer records
alvrs Jul 24, 2024
4435117
update protocol parser utils
alvrs Jul 24, 2024
4d16a78
add dozer sql utils
alvrs Jul 24, 2024
a51dbea
add helpers for gettting keys and values from record
alvrs Jul 25, 2024
988bbe2
add encoding / decoding helpers
alvrs Jul 25, 2024
3a325d1
wip - createStoreSync
alvrs Jul 25, 2024
218a8f9
fix selectFrom
alvrs Jul 26, 2024
531fec8
update type
alvrs Jul 28, 2024
9bede30
various fixes
alvrs Jul 31, 2024
8520a8e
wip new react template
alvrs Jul 31, 2024
8ed3a3c
add fetchInitialBlockLogsDozer
alvrs Jul 31, 2024
1bb493e
make filter optional
alvrs Jul 31, 2024
b21e7c7
add util to fetch logs from dozer
alvrs Jul 31, 2024
93edbe9
add minimal type benchmarks
alvrs Aug 5, 2024
cfe744c
rename tableLabel to table
alvrs Aug 5, 2024
2ec06e9
wip - refactor
alvrs Aug 6, 2024
8a8c6eb
split up createStore in implementations
alvrs Aug 7, 2024
aec1d2e
add free functions and test for API surface equivalence
alvrs Aug 7, 2024
8d25dd8
update query fragment interface
alvrs Aug 7, 2024
a46bff7
migrate query API
alvrs Aug 7, 2024
73e8ac8
fix tests
alvrs Aug 7, 2024
1b97bb3
fix: use updated core API
Kooshaba Aug 8, 2024
fe502fc
add benchmark for setting records
alvrs Aug 8, 2024
172e5b9
big refactor
alvrs Aug 8, 2024
1c46c81
finish refactor
alvrs Aug 8, 2024
8bdf989
update react shim
alvrs Aug 8, 2024
247dc0f
update storage adapter
alvrs Aug 8, 2024
533fed2
add more test cases
alvrs Aug 9, 2024
0be5a41
rename subscribe to subscribeTable to make room for subscribeStore
alvrs Aug 9, 2024
a3f6819
add subscribeStore and test
alvrs Aug 9, 2024
cc703df
chore: add mismatched tables in key mismatch error message
Kooshaba Aug 11, 2024
3a1352d
fix: add UpdateType to recs shim
Kooshaba Aug 11, 2024
434cd24
fix: use namespace name smaller than 14 bytes
Kooshaba Aug 11, 2024
ab43659
fix: fix import in bench script
Kooshaba Aug 11, 2024
37b3929
add react selector hook
alvrs Aug 9, 2024
60e9126
todo is done
alvrs Aug 12, 2024
f115738
strong types for common types
alvrs Aug 12, 2024
7110185
add types and type tests for setRecord
alvrs Aug 12, 2024
c250c7a
wip - thinking about requiring full table config
alvrs Aug 12, 2024
e3e8bfd
switch over setRecord to take in full table config
alvrs Aug 12, 2024
fbdd491
add types for setRecords
alvrs Aug 13, 2024
72655a2
add types for registerTable, getTable
alvrs Aug 13, 2024
30cdd74
move tests for registerTable, getTable
alvrs Aug 13, 2024
6a686d8
fix tests
alvrs Aug 13, 2024
034457c
add types for getTable
alvrs Aug 13, 2024
fd4a4cc
change test order
alvrs Aug 13, 2024
432b034
update encodeKey and decodeKey types, add decodeKey test
alvrs Aug 13, 2024
3839b8a
update more signatures to table
alvrs Aug 13, 2024
f3daf4d
make tests pass again
alvrs Aug 13, 2024
be85990
add deleteRecord test
alvrs Aug 13, 2024
85d9c84
add generic params to default action decorator
alvrs Aug 13, 2024
78eb841
add test for encodeKey
alvrs Aug 13, 2024
69859bf
add extend test
alvrs Aug 14, 2024
3682c1f
add test for getConfig
alvrs Aug 14, 2024
1c732f3
add getKeys test
alvrs Aug 14, 2024
7322d79
add getKeys tests
alvrs Aug 14, 2024
1af8961
add getRecords test
alvrs Aug 14, 2024
048f7d4
add getTables test
alvrs Aug 14, 2024
1d1dd57
add subscribeStore test
alvrs Aug 14, 2024
0eecbaa
add tests for subscribeStore
alvrs Aug 14, 2024
e5eb978
add tests for subscribeTable
alvrs Aug 14, 2024
2371048
feat(config,store): add strongly typed namespaceLabel to table config…
alvrs Aug 14, 2024
0d8fe0b
switch to namespace label
alvrs Aug 14, 2024
5d17ce9
type runQuery
alvrs Aug 14, 2024
a5536e5
fix some tests
alvrs Aug 14, 2024
c063e33
add subscribeQuery types
alvrs Aug 14, 2024
7ae1083
add types for bound subscribeQuery
alvrs Aug 14, 2024
5db9987
fix build issues
alvrs Aug 15, 2024
175c0bd
type table
alvrs Aug 15, 2024
9a02d5b
update useSelector type
alvrs Aug 15, 2024
4b46dc7
update bench
alvrs Aug 15, 2024
544c3f2
rename zustand-query to stash
alvrs Aug 15, 2024
6d4eba8
rename file to createStash
alvrs Aug 15, 2024
e8e0fe5
rename store references to stash
alvrs Aug 15, 2024
14ad493
stash type
alvrs Aug 15, 2024
e3c6d0b
fix build errors
alvrs Aug 15, 2024
7f4a600
fix: null check on sync progress
Kooshaba Aug 21, 2024
8c97a79
add rxjs wrapper
Kooshaba Aug 21, 2024
10 changes: 10 additions & 0 deletions packages/config/src/common.ts
@@ -23,10 +23,20 @@ export type Schema = {
};
};

export type KeySchema = {
Review comment (Member):

if we're not gonna add this to table config output, this should probably live in protocol-parser for now

(I realize we have KeySchema there already but it's outdated and we have a lot of work to migrate old helpers over to new shapes)

readonly [fieldName: string]: {
/** the Solidity primitive ABI type */
readonly type: StaticAbiType;
/** the user defined type or Solidity primitive ABI type */
readonly internalType: string;
};
};
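For context, a value conforming to this shape might look like the following standalone sketch (types inlined rather than imported; the `positionKey` schema and its `MatchId` internal type are hypothetical examples, not part of this PR):

```typescript
// Inlined subset of the types above, for illustration only.
type StaticAbiType = "address" | "uint32" | "uint256" | "bool";

type KeySchema = {
  readonly [fieldName: string]: {
    /** the Solidity primitive ABI type (key fields must be statically sized) */
    readonly type: StaticAbiType;
    /** the user defined type or Solidity primitive ABI type */
    readonly internalType: string;
  };
};

// A hypothetical key schema with a user-defined internalType on one field.
const positionKey: KeySchema = {
  player: { type: "address", internalType: "address" },
  matchId: { type: "uint32", internalType: "MatchId" },
};
```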

export type Table = {
readonly label: string;
readonly type: satisfy<ResourceType, "table" | "offchainTable">;
readonly namespace: string;
readonly namespaceLabel: string;
readonly name: string;
readonly tableId: Hex;
readonly schema: Schema;
2 changes: 1 addition & 1 deletion packages/config/src/exports/index.ts
@@ -4,4 +4,4 @@
* Be sure we're ready to commit to these being supported and changes made backward compatible!
*/

-export type { AbiType, StaticAbiType, DynamicAbiType, Schema, Table, Tables } from "../common";
+export type { AbiType, StaticAbiType, DynamicAbiType, Schema, KeySchema, Table, Tables } from "../common";
3 changes: 3 additions & 0 deletions packages/protocol-parser/src/common.ts
@@ -1,3 +1,4 @@
import { Table } from "@latticexyz/config";
import {
DynamicAbiType,
SchemaAbiType,
@@ -51,3 +52,5 @@ export type ValueArgs = {
encodedLengths: Hex;
dynamicData: Hex;
};

export type PartialTable = Pick<Table, "schema" | "key">;
Review comment (Member):

is there a better name for this? maybe TableShape or TableSchema?

when I was using it as PartialTable, it was isolated/contained to a file and wasn't meant to be exported, so the name wasn't as meaningful

9 changes: 9 additions & 0 deletions packages/protocol-parser/src/decodeDozerField.test.ts
@@ -0,0 +1,9 @@
import { describe, expect, it } from "vitest";
import { decodeDozerField } from "./decodeDozerField";

describe("decodeDozerField", () => {
it("should decode numbers to the expected value type", () => {
expect(decodeDozerField("uint48", "1")).toBe(1);
expect(decodeDozerField("uint56", "1")).toBe(1n);
});
});
24 changes: 24 additions & 0 deletions packages/protocol-parser/src/decodeDozerField.ts
@@ -0,0 +1,24 @@
import { AbiType } from "@latticexyz/config";
import {
ArrayAbiType,
SchemaAbiTypeToPrimitiveType,
arrayToStaticAbiType,
schemaAbiTypeToDefaultValue,
} from "@latticexyz/schema-type/internal";

export function decodeDozerField<abiType extends AbiType>(
Review comment (Member):

we talked about moving this over to store-sync (or near where it's used) since it's not part of the protocol

abiType: abiType,
data: string | boolean | string[],
): SchemaAbiTypeToPrimitiveType<abiType> {
const defaultValueType = typeof schemaAbiTypeToDefaultValue[abiType];
if (Array.isArray(data)) {
return data.map((element) => decodeDozerField(arrayToStaticAbiType(abiType as ArrayAbiType), element)) as never;
}
if (defaultValueType === "number") {
return Number(data) as never;
}
if (defaultValueType === "bigint") {
return BigInt(data) as never;
}
return data as never;
}
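The number/bigint split above comes from the default-value types in `schemaAbiTypeToDefaultValue`, which — as the `uint48` vs `uint56` test suggests — appear to use `number` only for integer types that always fit in a JavaScript safe integer. A standalone check of that boundary (no package imports; this only demonstrates the JS limit, not the library's mapping):

```typescript
// uint48 values always fit in a JS number: the max is 2^48 - 1, below 2^53 - 1.
const maxUint48 = 2 ** 48 - 1;
const uint48Safe = Number.isSafeInteger(maxUint48);

// uint56 values can exceed Number.MAX_SAFE_INTEGER, so they need bigint.
// (2^56 - 1 is not even exactly representable as a double.)
const maxUint56 = 2 ** 56 - 1;
const uint56Safe = Number.isSafeInteger(maxUint56);
```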
38 changes: 38 additions & 0 deletions packages/protocol-parser/src/decodeDozerRecords.test.ts
@@ -0,0 +1,38 @@
import { describe, expect, it } from "vitest";
import { decodeDozerRecords } from "./decodeDozerRecords";

describe("decodeDozerRecord", () => {
const schema = {
address: { type: "address", internalType: "address" },
uint256: { type: "uint256", internalType: "uint256" },
uint32: { type: "uint32", internalType: "uint32" },
bool: { type: "bool", internalType: "bool" },
bytes: { type: "bytes", internalType: "bytes" },
string: { type: "string", internalType: "string" },
uint32Arr: { type: "uint32[]", internalType: "uint32[]" },
} as const;

it("decodes dozer record", () => {
const dozerRecord = [
"0x0000000000000000000000000000000000000000",
"1234",
"1234",
true,
"0x1234",
"hello world",
["1234", "5678"],
];
const decodedRecord = {
address: "0x0000000000000000000000000000000000000000",
uint256: 1234n,
uint32: 1234,
bool: true,
bytes: "0x1234",
string: "hello world",
uint32Arr: [1234, 5678],
};

const decoded = decodeDozerRecords({ schema, records: [dozerRecord] });
expect(decoded).toStrictEqual([decodedRecord]);
});
});
44 changes: 44 additions & 0 deletions packages/protocol-parser/src/decodeDozerRecords.ts
@@ -0,0 +1,44 @@
import { Schema } from "@latticexyz/config";
import { decodeDozerField } from "./decodeDozerField";
import { getSchemaPrimitives } from "./getSchemaPrimitives";

type DozerQueryHeader = string[];
type DozerQueryRecord = (string | boolean | string[])[];

// First item in the result is the header
export type DozerQueryResult = [DozerQueryHeader, ...DozerQueryRecord[]];

export type DecodeDozerRecordsArgs = {
schema: Schema;
records: DozerQueryResult;
};

/**
* Trim the header row from the query result
*/
function trimHeader(result: DozerQueryResult): DozerQueryRecord[] {
return result.slice(1);
}

export type DecodeDozerRecordsResult<schema extends Schema = Schema> = getSchemaPrimitives<schema>[];

export function decodeDozerRecords<schema extends Schema>({
schema,
records,
}: DecodeDozerRecordsArgs): DecodeDozerRecordsResult<schema> {
const fieldNames = Object.keys(schema);
if (records.length > 0 && fieldNames.length !== records[0].length) {
throw new Error(
`Mismatch between schema and query result.\nSchema: [${fieldNames.join(", ")}]\nQuery result: [${records[0].join(", ")}]`,
);
}

return trimHeader(records).map((record) =>
Object.fromEntries(
Object.keys(schema).map((fieldName, index) => [
fieldName,
decodeDozerField(schema[fieldName].type, record[index]),
]),
),
) as never;
}
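To make the wire shape concrete: the first row of a dozer query result is the header, and `trimHeader` drops it before decoding. A self-contained sketch with hypothetical values (types inlined from the file above, no imports):

```typescript
// A dozer query result: header row first, then one row per record.
type DozerQueryRecord = (string | boolean | string[])[];
type DozerQueryResult = [string[], ...DozerQueryRecord[]];

const result: DozerQueryResult = [
  ["address", "uint32", "bool"], // header row, trimmed before decoding
  ["0x0000000000000000000000000000000000000000", "1234", true],
];

// Equivalent of trimHeader: everything after the first row is a record.
const records = result.slice(1) as DozerQueryRecord[];
```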
5 changes: 5 additions & 0 deletions packages/protocol-parser/src/exports/internal.ts
Expand Up @@ -25,6 +25,11 @@ export * from "../schemaToHex";
export * from "../staticDataLength";
export * from "../valueSchemaToFieldLayoutHex";
export * from "../valueSchemaToHex";
export * from "../decodeDozerField";
export * from "../decodeDozerRecords";

export * from "../getKey";
export * from "../getValue";

export * from "../getKeySchema";
export * from "../getValueSchema";
25 changes: 25 additions & 0 deletions packages/protocol-parser/src/getKey.test.ts
@@ -0,0 +1,25 @@
import { describe, expect, it } from "vitest";
import { getKey } from "./getKey";

describe("getKey", () => {
it("should return the key fields of the record", () => {
const table = {
schema: {
key1: { type: "uint32", internalType: "uint32" },
key2: { type: "uint256", internalType: "uint256" },
value1: { type: "string", internalType: "string" },
value2: { type: "string", internalType: "string" },
},
key: ["key1", "key2"],
} as const;
const record = { key1: 1, key2: 2n, value1: "hello", value2: "world" };
const key = getKey(table, record);

expect(key).toMatchInlineSnapshot(`
{
"key1": 1,
"key2": 2n,
}
`);
});
});
10 changes: 10 additions & 0 deletions packages/protocol-parser/src/getKey.ts
@@ -0,0 +1,10 @@
import { getKeySchema } from "./getKeySchema";
import { getSchemaPrimitives } from "./getSchemaPrimitives";
import { PartialTable } from "./common";

export function getKey<table extends PartialTable>(
table: table,
record: getSchemaPrimitives<table["schema"]>,
): getSchemaPrimitives<getKeySchema<table>> {
return Object.fromEntries(table.key.map((fieldName) => [fieldName, record[fieldName]])) as never;
}
8 changes: 5 additions & 3 deletions packages/protocol-parser/src/getKeySchema.ts
@@ -1,11 +1,13 @@
-import { Schema, Table } from "@latticexyz/config";
+import { KeySchema, StaticAbiType, Table } from "@latticexyz/config";

type PartialTable = Pick<Table, "schema" | "key">;

export type getKeySchema<table extends PartialTable> = PartialTable extends table
-  ? Schema
+  ? KeySchema
: {
-      readonly [fieldName in Extract<keyof table["schema"], table["key"][number]>]: table["schema"][fieldName];
+      readonly [fieldName in Extract<keyof table["schema"], table["key"][number]>]: table["schema"][fieldName] & {
+        type: StaticAbiType;
+      };
};

export function getKeySchema<table extends PartialTable>(table: table): getKeySchema<table> {
25 changes: 25 additions & 0 deletions packages/protocol-parser/src/getValue.test.ts
@@ -0,0 +1,25 @@
import { describe, expect, it } from "vitest";
import { getValue } from "./getValue";

describe("getValue", () => {
it("should return the value fields of the record", () => {
const table = {
schema: {
key1: { type: "uint32", internalType: "uint32" },
key2: { type: "uint256", internalType: "uint256" },
value1: { type: "string", internalType: "string" },
value2: { type: "string", internalType: "string" },
},
key: ["key1", "key2"],
} as const;
const record = { key1: 1, key2: 2n, value1: "hello", value2: "world" };
const value = getValue(table, record);

expect(value).toMatchInlineSnapshot(`
{
"value1": "hello",
"value2": "world",
}
`);
});
});
14 changes: 14 additions & 0 deletions packages/protocol-parser/src/getValue.ts
@@ -0,0 +1,14 @@
import { PartialTable } from "./common";
import { getValueSchema } from "./getValueSchema";
import { getSchemaPrimitives } from "./getSchemaPrimitives";

export function getValue<table extends PartialTable>(
table: table,
record: getSchemaPrimitives<table["schema"]>,
): getSchemaPrimitives<getValueSchema<table>> {
return Object.fromEntries(
Object.keys(table.schema)
.filter((fieldName) => !table.key.includes(fieldName))
.map((fieldName) => [fieldName, record[fieldName]]),
) as never;
}
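Together, `getKey` and `getValue` split one record into its key and value parts using only `table.schema` and `table.key`. A dependency-free sketch of that split, with the table shape copied from the tests above:

```typescript
// Table shape from the tests: schema plus the list of key field names.
const table = {
  schema: {
    key1: { type: "uint32", internalType: "uint32" },
    key2: { type: "uint256", internalType: "uint256" },
    value1: { type: "string", internalType: "string" },
    value2: { type: "string", internalType: "string" },
  },
  key: ["key1", "key2"],
} as const;

const record: Record<string, unknown> = { key1: 1, key2: 2n, value1: "hello", value2: "world" };

// Key part: fields picked in the order listed in table.key (mirrors getKey).
const key = Object.fromEntries(table.key.map((fieldName) => [fieldName, record[fieldName]]));

// Value part: every schema field that is not a key field (mirrors getValue).
const value = Object.fromEntries(
  Object.keys(table.schema)
    .filter((fieldName) => !(table.key as readonly string[]).includes(fieldName))
    .map((fieldName) => [fieldName, record[fieldName]]),
);
```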
1 change: 1 addition & 0 deletions packages/stash/CHANGELOG.md
@@ -0,0 +1 @@
# @latticexyz/stash
3 changes: 3 additions & 0 deletions packages/stash/README.md
@@ -0,0 +1,3 @@
# Stash

High performance client store and query engine for MUD
67 changes: 67 additions & 0 deletions packages/stash/package.json
@@ -0,0 +1,67 @@
{
"name": "@latticexyz/stash",
"version": "2.0.12",
"description": "High performance client store and query engine for MUD",
"repository": {
"type": "git",
"url": "https://github.com/latticexyz/mud.git",
"directory": "packages/stash"
},
"license": "MIT",
"type": "module",
"exports": {
".": "./dist/index.js",
"./internal": "./dist/internal.js",
"./recs": "./dist/recs.js"
},
"typesVersions": {
"*": {
"index": [
"./dist/index.d.ts"
],
"internal": [
"./dist/internal.d.ts"
],
"recs": [
"./dist/recs.d.ts"
]
}
},
"files": [
"dist"
],
"scripts": {
"bench": "tsx src/bench.ts",
"build": "tsup",
"clean": "rimraf dist",
"dev": "tsup --watch",
"test": "vitest typecheck --run --passWithNoTests && vitest --run --passWithNoTests",
"test:ci": "pnpm run test"
},
"dependencies": {
"@arktype/util": "0.0.40",
"@latticexyz/config": "workspace:*",
"@latticexyz/protocol-parser": "workspace:*",
"@latticexyz/recs": "workspace:*",
"@latticexyz/schema-type": "workspace:*",
"@latticexyz/store": "workspace:*",
"mutative": "^1.0.6",
"react": "^18.2.0",
"rxjs": "7.5.5",
"viem": "2.9.20",
"zustand": "^4.3.7",
"zustand-mutative": "^1.0.1"
},
"devDependencies": {
"@arktype/attest": "0.7.5",
"@testing-library/react": "^16.0.0",
"@testing-library/react-hooks": "^8.0.1",
"@types/react": "18.2.22",
"react-dom": "^18.2.0",
"tsup": "^6.7.0",
"vitest": "0.34.6"
},
"publishConfig": {
"access": "public"
}
}
24 changes: 24 additions & 0 deletions packages/stash/src/README.md
@@ -0,0 +1,24 @@
# TODOs

- Set up performance benchmarks for setting records, reading records, running queries, updating records with subscribers
- Replace objects with Maps for performance? (see https://rehmat-sayany.medium.com/using-map-over-objects-in-javascript-a-performance-benchmark-ff2f351851ed)
- could be useful for TableRecords, Keys
- maybe add option to include records in the query result?
- Maybe turn `entityKey` into a tagged string, so we could make it the return type of `encodeKey`
  and later switch to Symbol (to reduce memory overhead) or something else without a breaking change
- add more query fragments (e.g. GreaterThan, LessThan, Range)
- we might be able to enable different key shapes if we add something like a `keySelector`
- getKeySchema expects a full table as type, but only needs schema and key

Ideas

- Update streams:
- if each query added a new subscriber to the main state, each state update would trigger _every_ subscriber
- instead we could add a subscriber per table, which can be subscribed to again, so we only have to iterate through all tables once,
and then through all subscribers per table (only those who care about updates to this table)
- Query fragments:
- Instead of pre-defined query types (Has, HasValue, Not, NotValue), could we define fragments in a self-contained way, so
it's easy to add a new type of query fragment without changing the core code?
- The main complexity is the logic to initialize the initial set with the first query fragment,
but it's probably not that critical - we could just run the first fragment on all entities of the first table,
unless an initialSet is provided.
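The per-table update stream idea above can be sketched as follows — a hypothetical shape, not the actual stash implementation: each table keeps its own subscriber set, so a state update only iterates subscribers of the affected table instead of fanning out to every query:

```typescript
type Subscriber<Update> = (update: Update) => void;

// Hypothetical: one subscriber set per table instead of one global set.
class TableSubscribers<Update> {
  private readonly subscribers = new Set<Subscriber<Update>>();

  subscribe(subscriber: Subscriber<Update>): () => void {
    this.subscribers.add(subscriber);
    // Returning an unsubscribe function mirrors the subscribeTable API in this PR.
    return () => void this.subscribers.delete(subscriber);
  }

  notify(update: Update): void {
    for (const subscriber of this.subscribers) subscriber(update);
  }
}

// Usage: updates before unsubscribing are delivered, later ones are not.
const table1 = new TableSubscribers<{ key: string }>();
const seen: string[] = [];
const unsubscribe = table1.subscribe((update) => seen.push(update.key));
table1.notify({ key: "a" }); // delivered
unsubscribe();
table1.notify({ key: "b" }); // not delivered
```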
28 changes: 28 additions & 0 deletions packages/stash/src/actions/decodeKey.test.ts
@@ -0,0 +1,28 @@
import { describe, it } from "vitest";
import { createStash } from "../createStash";
import { defineStore } from "@latticexyz/store/config/v2";
import { setRecord } from "./setRecord";
import { encodeKey } from "./encodeKey";
import { attest } from "@ark/attest";
import { decodeKey } from "./decodeKey";

describe("decodeKey", () => {
it("should decode an encoded table key", () => {
const config = defineStore({
namespace: "namespace1",
tables: {
table1: {
schema: { field1: "string", field2: "uint32", field3: "uint256" },
key: ["field2", "field3"],
},
},
});
const stash = createStash(config);
const table = config.namespaces.namespace1.tables.table1;
const key = { field2: 1, field3: 2n };
setRecord({ stash, table, key, record: { field1: "hello" } });

const encodedKey = encodeKey({ table, key });
attest<typeof key>(decodeKey({ stash, table, encodedKey })).equals({ field2: 1, field3: 2n });
});
});