tests: skip attention-related parameterize when attn_layer is 0 #3784
Conversation
The tests make no sense in this case. Signed-off-by: Jinzhe Zeng <[email protected]>
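A minimal sketch of the idea behind this change (not the repository's actual test helpers; the parameter names `attn_dotr` and the pytest-style grid below are illustrative assumptions): when `attn_layer` is 0 the attention block is never constructed, so varying attention-only options just repeats the same test. Filtering those combinations keeps the parameter grid meaningful.

```python
# Hedged illustration only: parameter names and values are hypothetical,
# not copied from the deepmd-kit test suite.
import itertools

import pytest

ATTN_LAYERS = (0, 2)        # hypothetical grid of attention layer counts
ATTN_DOTR = (True, False)   # attention-only flag (hypothetical)
TEMPERATURES = (None, 1.0)  # optional attention temperature (hypothetical)


def _param_grid():
    """Yield parameter tuples, collapsing attention-only variations
    when attn_layer == 0, since they all exercise the same code path."""
    seen = set()
    for attn_layer, attn_dotr, temperature in itertools.product(
        ATTN_LAYERS, ATTN_DOTR, TEMPERATURES
    ):
        if attn_layer == 0:
            # With no attention layers, these options have no effect;
            # keep a single canonical case instead of duplicates.
            attn_dotr, temperature = False, None
        key = (attn_layer, attn_dotr, temperature)
        if key not in seen:
            seen.add(key)
            yield key


@pytest.mark.parametrize("attn_layer,attn_dotr,temperature", list(_param_grid()))
def test_descriptor(attn_layer, attn_dotr, temperature):
    # Placeholder standing in for the real descriptor consistency checks.
    assert attn_layer >= 0
```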
Codecov Report
All modified and coverable lines are covered by tests ✅

Additional details and impacted files

@@           Coverage Diff           @@
##            devel    #3784   +/-  ##
=======================================
  Coverage   82.49%   82.49%
=======================================
  Files         515      515
  Lines       48642    48642
  Branches     2980     2980
=======================================
  Hits        40126    40126
  Misses       7605     7605
  Partials      911      911

☔ View full report in Codecov by Sentry.
…modeling#3784) The tests make no sense in this case.

Summary by CodeRabbit
- Tests
  - Improved test coverage by adding an optional `temperature` parameter to the attention layer tests.

Signed-off-by: Jinzhe Zeng <[email protected]>
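For context, a hedged sketch of what an optional `temperature` argument typically means for an attention layer (this is generic scaled dot-product attention, not deepmd-kit's actual implementation; the function name and defaults are assumptions): when supplied, it rescales the attention logits before the softmax, so tests parameterized over `temperature in (None, 1.0)` cover both the default scaling and an explicit override.

```python
# Toy example; not the project's real attention code.
import numpy as np


def scaled_dot_product_attention(q, k, v, temperature=None):
    """Attention with an optional temperature; defaults to sqrt(d_k)."""
    d_k = q.shape[-1]
    if temperature is None:
        temperature = np.sqrt(d_k)
    logits = q @ k.swapaxes(-2, -1) / temperature
    # Numerically stable softmax over the last axis.
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v


q = k = v = np.random.default_rng(0).normal(size=(4, 8))
out_default = scaled_dot_product_attention(q, k, v)
out_explicit = scaled_dot_product_attention(q, k, v, temperature=1.0)
assert out_default.shape == out_explicit.shape == (4, 8)
```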