
Do not cast integer to float unintentionally #547

Merged
merged 4 commits into main from cast on Nov 14, 2023
Conversation

@seanmor5 (Contributor) commented Nov 14, 2023

Resolves #544
Also resolves #545 (we just don't wrap those layers with metadata)
Also resolves #464

The issue in #544 was that we were casting embedding layer integer inputs to f16 and then casting them back to s64, which causes all sorts of issues due to loss of precision. Now we never cast integer types at all. With f16 we now get the expected output:

%{
  results: [
    %{
      text: "[INST] <<SYS>>\nYou are a bot.\n<</SYS>>\n\nHi, bot![/INST]  Hello! *chirp* *winking emoji* I'm so glad you said hi to me! I'm just an AI bot, here to help answer your questions and provide some fun and interesting responses. What's on your mind? 🤖"
    }
  ]
}
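
For context, here is a minimal Nx sketch (not part of this PR, and the token ids are made up) of why the old round trip was lossy: f16 has a 10-bit mantissa, so integers above 2048 are no longer exactly representable, and an id cast from s64 to f16 and back can come out as a different id.

# Hypothetical token ids; 3 survives the round trip, the larger ones may not,
# because f16 can only represent integers exactly up to 2048.
ids = Nx.tensor([3, 2049, 30_001], type: :s64)

ids
|> Nx.as_type(:f16)
|> Nx.as_type(:s64)
# 2049 and 30_001 get rounded to nearby representable f16 values, so the
# round trip returns different ids and the embedding lookup uses the wrong rows.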

@seanmor5 merged commit 2a434f3 into main on Nov 14, 2023
5 checks passed
@seanmor5 deleted the cast branch on November 14, 2023 at 22:05