Hi,

this is not an issue but rather a basic question. I hope you can still help me.

I would like to store my data (locally) in chunks because I do not know the final size beforehand. So eventually, I want to store n training examples, each a 2D tensor of shape (m, l). The first dimension n is not known in advance, so I would like to write chunks of, say, 512 training examples and resize the first dimension accordingly.
Once stored, I want to load a random batch for further use in Torch.
Thanks in advance for your help! :)
It sounds like you're looking for the resize method. You'd initialize the store with some arbitrarily large dimensions and then resize with `resize_tied_bounds` once you know the final extent of your store.
Here's a snippet of our C++ code that handles just that. We use the implicit dims as the lower bound because we expect everything to have an origin at zero.
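In the meantime, the overallocate-then-resize pattern described above can be sketched in plain NumPy (a backend-agnostic illustration only — with the actual store you'd open it with large bounds and shrink them via the resize call; the chunk size, shapes, and capacity below are illustrative, not from the original snippet):

```python
import numpy as np

CHUNK = 512          # examples written per chunk
M, L = 4, 3          # per-example tensor shape (m, l); values are illustrative
CAPACITY = 10_000    # arbitrarily large upper bound for the first dimension

rng = np.random.default_rng(0)
store = np.zeros((CAPACITY, M, L), dtype=np.float32)

n = 0
for _ in range(3):   # three incoming chunks of training examples
    chunk = rng.random((CHUNK, M, L), dtype=np.float32)
    store[n:n + CHUNK] = chunk
    n += CHUNK

# Once the final extent is known, "resize" down to the actual n.
store = store[:n]

# Load a random batch, e.g. to hand to torch.from_numpy(batch) later.
batch_idx = rng.choice(n, size=32, replace=False)
batch = store[batch_idx]
print(store.shape, batch.shape)   # (1536, 4, 3) (32, 4, 3)
```

The same flow maps onto the store: write each chunk at offset `n`, then issue one shrinking resize of the first dimension at the end instead of the slice.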