Right now we only support a small number of models. Let's expand this! A good start would be to look at https://github.com/huggingface/candle/tree/main/candle-examples/examples and port those examples over to Orca.

Note: Copy pasta is not enough; look at how I did it for Bert and Quantized for inspiration. Those could also be improved to expose a cleaner interface.

Edit: I am refactoring to have a models API in a separate crate, orca-models. This crate's goal is to provide an API for using hosted models such as those provided by OpenAI, as well as an API for easily using Candle transformer models. It should become the main point of development and replace orca-core's model implementations.
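For discussion, here is a minimal sketch of what a unified interface in orca-models could look like, assuming a single text-generation trait that both remote (OpenAI) and local (Candle) backends implement. The trait and type names below (`Generate`, `OpenAiModel`, `QuantizedModel`) are illustrative assumptions, not the crate's actual API:

```rust
// Hypothetical sketch, not the real orca-models API.
use anyhow::Result;
use async_trait::async_trait;

/// One entry point for text generation, whether the model runs
/// remotely (OpenAI) or locally (a Candle transformer).
#[async_trait]
pub trait Generate {
    async fn generate(&self, prompt: &str) -> Result<String>;
}

/// Hosted model: wraps an HTTP client plus credentials.
pub struct OpenAiModel {
    api_key: String,
    model: String, // e.g. "gpt-3.5-turbo"
}

#[async_trait]
impl Generate for OpenAiModel {
    async fn generate(&self, prompt: &str) -> Result<String> {
        // Call the OpenAI chat completions endpoint here and
        // return the assistant reply.
        todo!("send request using self.api_key / self.model")
    }
}

/// Local model: wraps a Candle quantized model and its tokenizer.
pub struct QuantizedModel {
    // weights + tokenizer loaded from disk or the Hugging Face Hub
}

#[async_trait]
impl Generate for QuantizedModel {
    async fn generate(&self, prompt: &str) -> Result<String> {
        // Tokenize, run the Candle forward pass, sample, and decode.
        todo!("run local inference with candle")
    }
}
```

Callers would then only depend on the `Generate` trait, so swapping an OpenAI model for a local Candle one (or adding a newly ported example as a backend) would not require touching the calling code.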
Learning from experience in other programming language ecosystems, it is good to have two separate kinds of libraries:

A) One library that serves development and production environments well: for each problem there is one, generally best, way to solve it.
B) One library favored by researchers for its breadth of packages: it includes every known algorithm, even those that are rarely used.

Do we carefully select what gets included, or do we dump everything out there? Let's discuss!