Replies: 3 comments
-
@ginko3 Can you provide more details about which language bindings you're using and maybe some sample code demonstrating how you're currently loading it? @mxnet-label-bot add [question]
-
Sure. I create the .rec file with:
python im2rec.py dataset_train.lst dataset/ --pack-label --num-thread 8
My code looks like:
num_gpus = 8
ctx = [mx.gpu(i) for i in range(num_gpus)]
data_train = mx.io.ImageRecordIter(
path_imgrec='dataset_train.rec',
data_shape=(3, 256, 256),
batch_size=16,
label_width = 22,
ctx=ctx
)
When I print a batch, its context shows up as described in my question (cpu_pinned(0)).
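(A minimal way to check that, assuming the data_train iterator defined above; NDArray.context reports where each array lives:)
batch = data_train.next()
print(batch.data[0].context)    # shows cpu_pinned(0) in this setup
print(batch.label[0].context)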
-
I don't think it is possible at the moment. The data is always loaded into RAM first, and then you need to manually move it to the proper GPU memory with either as_in_context() or copyto(). If you can load a batch of 16 items in RAM but would like to stage more of them on the GPUs before doing a forward pass, then I would keep loading batches with ImageRecordIter as usual and copy each one to a different GPU before running the forward pass.
Something like this (I haven't run the code myself, so it might need adjustments):
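(A minimal sketch of that idea, reusing the ImageRecordIter settings from the comment above; as_in_context() does the explicit copies, and num_gpus / batches_per_gpu are placeholder values:)

import mxnet as mx

num_gpus = 8          # placeholder, matching the setup above
batches_per_gpu = 4   # how many batches to stage on each GPU before the forward pass

ctx = [mx.gpu(i) for i in range(num_gpus)]
data_train = mx.io.ImageRecordIter(
    path_imgrec='dataset_train.rec',
    data_shape=(3, 256, 256),
    batch_size=16,
    label_width=22)

staged = [[] for _ in range(num_gpus)]
for _ in range(batches_per_gpu):
    for i in range(num_gpus):
        batch = data_train.next()                      # decoded into (pinned) CPU RAM
        data = batch.data[0].as_in_context(ctx[i])     # explicit copy to GPU i
        label = batch.label[0].as_in_context(ctx[i])
        staged[i].append((data, label))

# staged[i] now holds batches_per_gpu batches resident on mx.gpu(i),
# ready for forward passes without further host-to-device copies.

Each batch still passes through CPU RAM, but only one at a time, so the host memory footprint stays at a single batch while the staged copies accumulate in GPU memory.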
-
I am running a configuration with a lot of vRAM (8 GPUs available) but very little RAM. I want to load a dataset of images saved in folders and in rec format. I am currently using mx.io.ImageRecordIter to load said rec file, but I can't specify the context, and everything ends up loaded in cpu_pinned(0). Is there a way to load images directly to my GPUs?
Thanks in advance.