Enable detection of arbitrary number of tokens in batches of 1000
- Fixes #3661

Co-authored-by: Brian Bergeron <[email protected]>
MajorLift and bergeron committed Feb 27, 2024
1 parent a09fcda commit f5c3194
Showing 2 changed files with 4 additions and 5 deletions.
1 change: 1 addition & 0 deletions packages/assets-controllers/CHANGELOG.md
@@ -26,6 +26,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
- **BREAKING:** In Mainnet, even if the `PreferenceController`'s `useTokenDetection` option is set to false, automatic token detection is performed on the legacy token list (token data from the contract-metadata repo).
- **BREAKING:** The `TokensState` type is now defined as a type alias rather than an interface. ([#3690](https://github.com/MetaMask/core/pull/3690/))
- `TokensState` now extends the `Record` types, and it has an index signature of `string`, making it compatible with the `BaseControllerV2` state object constraint of `Record<string, Json>`.
- The `detectTokens` method can now process an arbitrary number of tokens in batches of 1000.

### Removed

8 changes: 3 additions & 5 deletions packages/assets-controllers/src/TokenDetectionController.ts
@@ -566,11 +566,9 @@ export class TokenDetectionController extends StaticIntervalPollingController<
}

const slicesOfTokensToDetect = [];
-    slicesOfTokensToDetect[0] = tokensToDetect.slice(0, 1000);
-    slicesOfTokensToDetect[1] = tokensToDetect.slice(
-      1000,
-      tokensToDetect.length - 1,
-    );
+    for (let i = 0, size = 1000; i < tokensToDetect.length; i += size) {
+      slicesOfTokensToDetect.push(tokensToDetect.slice(i, i + size));
+    }

return slicesOfTokensToDetect;
}
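Taken on its own, the new loop splits `tokensToDetect` into consecutive slices of at most 1,000 entries. The removed code produced at most two slices, did not cap the second one at 1,000, and its `tokensToDetect.length - 1` end index dropped the last token whenever more than 1,000 were queued. Below is a minimal standalone sketch of the same fixed-size batching pattern; the `chunk` helper and the sample token addresses are illustrative assumptions, not code from `TokenDetectionController`.

```ts
// Minimal sketch of the fixed-size batching pattern used by the new loop.
// `chunk` and the sample data below are illustrative only, not code from
// TokenDetectionController.
function chunk<T>(items: T[], size = 1000): T[][] {
  const slices: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    // slice() clamps the end index, so the final batch may be shorter than `size`.
    slices.push(items.slice(i, i + size));
  }
  return slices;
}

// Example: 2,500 token addresses yield batches of 1000, 1000, and 500.
const tokensToDetect = Array.from({ length: 2500 }, (_, i) => `0xToken${i}`);
const batches = chunk(tokensToDetect);
console.log(batches.map((batch) => batch.length)); // [ 1000, 1000, 500 ]
```

With this shape, every pending token is included exactly once regardless of how many are queued, which matches the changelog entry above.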
