from pytorch_forecasting import TimeSeriesDataSet
from pytorch_forecasting.data import GroupNormalizer

# training dataset: all observations up to the cutoff
training = TimeSeriesDataSet(
    Final_DF[lambda x: x.time_idx <= training_cutoff],
    time_idx="time_idx",
    target="target",
    group_ids=["group_id"],
    max_encoder_length=max_encoder_length,
    min_encoder_length=max_encoder_length // 2,
    max_prediction_length=max_prediction_length,
    min_prediction_length=max_prediction_length,
    static_reals=[],  # add any static features, e.g. group-level metadata
    time_varying_known_reals=[
        "time_idx",
        "Dollar_Index",
        "Gold_Price",
        "Interest_Rate",
        "US_10_Year",
        "VIX_Value",
        "month",
        "day_of_week",
        "year",
        "day_of_year",
        "quarter",
    ],
    time_varying_unknown_categoricals=[],
    time_varying_unknown_reals=["target", "lag_1", "lag_3", "lag_7", "ma_3", "ma_7"],
    # normalize the target per group; softplus keeps outputs positive
    target_normalizer=GroupNormalizer(
        groups=["group_id"], transformation="softplus"
    ),
    add_relative_time_idx=True,
    add_target_scales=True,
    add_encoder_length=True,
)
# create the validation set (predict=True means: predict the last max_prediction_length
# points in time for each series)
validation = TimeSeriesDataSet.from_dataset(training, Final_DF, predict=True, stop_randomization=True)
# create dataloaders for model
batch_size = 128  # set this between 32 and 128
train_dataloader = training.to_dataloader(train=True, batch_size=batch_size, num_workers=0)
val_dataloader = validation.to_dataloader(train=False, batch_size=batch_size * 10, num_workers=0)
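Once the dataset builds, the model itself would come from the same dataset object. A minimal sketch of that next step, using the library's TemporalFusionTransformer.from_dataset API; the hyperparameters below are illustrative placeholders, not tuned values:

import lightning.pytorch as pl  # older versions: import pytorch_lightning as pl
from pytorch_forecasting import TemporalFusionTransformer
from pytorch_forecasting.metrics import QuantileLoss

# build the TFT directly from the dataset so feature handling stays consistent
tft = TemporalFusionTransformer.from_dataset(
    training,
    learning_rate=0.03,        # placeholder; find via a learning-rate sweep
    hidden_size=16,            # placeholder capacity settings
    attention_head_size=1,
    dropout=0.1,
    hidden_continuous_size=8,
    loss=QuantileLoss(),       # quantile forecasts, the TFT default
)
trainer = pl.Trainer(max_epochs=30, gradient_clip_val=0.1)
trainer.fit(tft, train_dataloaders=train_dataloader, val_dataloaders=val_dataloader)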
I'm trying to build a TFT model that predicts cryptocurrency prices and generalizes across multiple cryptocurrencies. I'm getting this AssertionError when creating the dataset.
I've attached what my dataset looks like.
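Without the full traceback it's hard to pin down, but a quick check of the usual suspects for this assertion (gaps in time_idx within a group, series shorter than one encoder-plus-decoder window, NaNs in the target) might look like this, assuming the column names from the snippet above:

import pandas as pd

df = Final_DF[Final_DF.time_idx <= training_cutoff]

# time_idx must be an integer index, gapless within each group
assert pd.api.types.is_integer_dtype(df["time_idx"]), "time_idx must be an integer column"
# the target cannot contain NaNs
assert df["target"].notna().all(), "target contains NaNs"

# shortest usable series: min_encoder_length + min_prediction_length
min_len = max_encoder_length // 2 + max_prediction_length
for gid, g in df.groupby("group_id"):
    g = g.sort_values("time_idx")
    if len(g) < min_len:
        print(f"group {gid}: only {len(g)} rows, needs at least {min_len}")
    if (g["time_idx"].diff().dropna() != 1).any():
        print(f"group {gid}: time_idx has gaps (consider allow_missing_timesteps=True)")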