When running the stacked_dynamic_lstm benchmark model, a NameError is raised: "global name 'batch_size_tensor' is not defined".
$ python fluid_benchmark.py --model stacked_dynamic_lstm --device GPU --gpus 1
----------- Configuration Arguments -----------
batch_size: 32
cpus: 1
data_format: NCHW
data_path:
data_set: flowers
device: GPU
gpus: 1
infer_only: False
iterations: 80
learning_rate: 0.001
memory_optimize: False
model: stacked_dynamic_lstm
no_test: False
pass_num: 100
profile: False
skip_batch_num: 5
update_method: local
use_cprof: False
use_fake_data: False
use_nvprof: False
use_reader_op: False
Cache file /root/.cache/paddle/dataset/imdb/aclImdb_v1.tar.gz not found, downloading http://ai.stanford.edu/%7Eamaas/data/sentiment/aclImdb_v1.tar.gz
[==================================================]
Traceback (most recent call last):
  File "fluid_benchmark.py", line 452, in <module>
    main()
  File "fluid_benchmark.py", line 412, in main
    train_args = list(model_def.get_model(args))
  File "/Paddle/Paddle/benchmark/fluid/models/stacked_dynamic_lstm.py", line 113, in get_model
    target_vars=[batch_acc, batch_size_tensor])
NameError: global name 'batch_size_tensor' is not defined
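The traceback points at the model definition itself: stacked_dynamic_lstm.py lists batch_size_tensor in target_vars without ever creating it. A minimal sketch of the kind of change that resolves this class of NameError is below, following the pattern other fluid benchmark models use: create the tensor first, then let the accuracy op fill it with the per-batch sample count through its total argument. The names logit and label are assumptions standing in for the model's actual prediction and label variables.

import paddle.fluid as fluid

# Sketch only -- `logit` and `label` are placeholders for the model's
# actual prediction and label variables inside get_model().

# Create an int64 tensor to hold the number of samples in the batch.
batch_size_tensor = fluid.layers.create_tensor(dtype='int64')

# Have the accuracy op write the batch size into it via `total=`, so
# batch_size_tensor is defined before it appears in target_vars.
batch_acc = fluid.layers.accuracy(
    input=logit, label=label, total=batch_size_tensor)

With both variables defined, the benchmark's training loop can fetch batch_acc together with batch_size_tensor and weight the per-batch accuracy by the true batch size, which is presumably why both were put in target_vars in the first place.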