fix(pt): fix zero inputs for LayerNorm (#4134)
Fix #4064.

## Summary by CodeRabbit

- **Bug Fixes**
  - Improved robustness of layer normalization by handling empty input tensors, ensuring consistent output without errors.

Signed-off-by: Jinzhe Zeng <[email protected]>
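As an illustration of the kind of guard this fix describes, here is a minimal sketch (not the actual deepmd-kit code) of protecting a LayerNorm forward pass against empty, zero-element input tensors in PyTorch. The function name `layer_norm_safe` and the example shapes are assumptions for illustration only.

```python
# Sketch: short-circuit layer normalization on empty inputs so the call
# returns a same-shape empty tensor instead of erroring.
import torch


def layer_norm_safe(
    x: torch.Tensor,
    weight: torch.Tensor,
    bias: torch.Tensor,
    eps: float = 1e-5,
) -> torch.Tensor:
    """Apply layer normalization, handling zero-element inputs gracefully."""
    if x.numel() == 0:
        # Normalizing an empty tensor can fail or give backend-dependent
        # results, so return an empty tensor with the same shape and dtype.
        return torch.zeros_like(x)
    return torch.nn.functional.layer_norm(
        x, (x.shape[-1],), weight=weight, bias=bias, eps=eps
    )


# Usage: a batch with zero rows passes through without errors.
w = torch.ones(4)
b = torch.zeros(4)
empty = torch.empty(0, 4)
print(layer_norm_safe(empty, w, b).shape)  # torch.Size([0, 4])
```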