Summary
Being able to do prediction (task=prediction) on bin files.
Motivation
I need to analyze some statistics on some huge files (train, validate, test) that have been converted to bin format for speed. Bin files seem to be very efficient: the time saved is considerable when you need to reuse the same files several times, and loading is an order of magnitude faster with bin than with CSV. For reference, the files I use are huge, several TB.
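For context, the bin files were produced by LightGBM itself. A minimal config sketch of the conversion, assuming the CLI's save_binary task (the file name here is illustrative, not my actual data):
task = save_binary
data = data_2004_2006_split_train.csv
# writes data_2004_2006_split_train.csv.bin next to the text file, then exits without training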
Description
Being able to run:
task=prediction
data = data_2004_2006_split_validate.csv.bin
Right now I get:
[LightGBM] [Info] Finished loading parameters
[LightGBM] [Info] Finished initializing prediction, total used 4 iterations
[LightGBM] [Fatal] Unknown format of training data. Only CSV, TSV, and LibSVM (zero-based) formatted text files are supported.
Met Exceptions:
Unknown format of training data. Only CSV, TSV, and LibSVM (zero-based) formatted text files are supported.
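For completeness, the only configuration that works today points prediction back at the original text file, which re-parses several TB of CSV on every run. A minimal sketch (the model and output file names below are the CLI defaults, not from my setup):
task = prediction
data = data_2004_2006_split_validate.csv
# input_model defaults to LightGBM_model.txt, output_result to LightGBM_predict_result.txt
input_model = LightGBM_model.txt
output_result = LightGBM_predict_result.txt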
Thanks!
-- w
Thanks for the feature request. I've added it to #2302, along with the related requests to support prediction on Dataset objects in the R package (#2666) and the Python package (#6285).
Following this repo's policy for feature requests, I'm going to close this (since we track it over in #2302) until someone is actively working on it. Anyone reading this, just comment here if you'd like to work on this.