These are chat archives for dmlc/mxnet

Aug 2017
Daniel Haehn
Aug 09 2017 00:07
I was hoping that the NDArrayIter takes care of it
Andre Bieler
Aug 09 2017 06:08
train_iter =, label=your_label_array, batch_size=your_batch_size, shuffle=True/False)
be aware that the correct format for a convolution layer is data: (batch_size, channel, height, width); I think in your case you have the channels as the last index.
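A minimal sketch of the axis swap this describes, using numpy. The shapes here are hypothetical, assuming a channels-last array like the one in this thread:

```python
import numpy as np

# Hypothetical batch with channels last: (batch, height, width, channel)
x = np.zeros((8, 119, 119, 6), dtype='float32')

# Move channels to axis 1, since MXNet conv layers expect (batch, channel, height, width)
x_nchw = np.transpose(x, (0, 3, 1, 2))

print(x_nchw.shape)  # (8, 6, 119, 119)
```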
Daniel Haehn
Aug 09 2017 16:26

@abieler thank you! I swapped the axes but still no luck:

train_iter =, label=Y_train, batch_size=batch_size, shuffle=True)


TypeError: Invalid type '<type 'numpy.ndarray'>' for data, should be NDArray or numpy.ndarray
print type(X_train)
print type(Y_train)
print X_train.shape
print Y_train.shape
<type 'numpy.ndarray'>
<type 'numpy.ndarray'>
(80000, 6, 119, 119)
mx.__version__ = '0.10.0'
Daniel Haehn
Aug 09 2017 16:42

Funny, if I restrict the size

train_iter =[0:50000], label=Y_train, batch_size=batch_size, shuffle=True)

it works. I did some tests with the NDArray constructor to figure this out:

a = mx.nd.array(X_train[0:50000]) # no problem
a = mx.nd.array(X_train[0:60000]) # fails
MXNetError: [16:37:29] include/mxnet/././tensor_blob.h:247: Check failed: this->shape_.Size() == shape.Size() (5097960000 vs. 802992704) TBlob.get_with_shape: new and old shape do not match total elements

Stack trace returned 10 entries:
[bt] (0) /home/dhaehn/D1/lib/python2.7/site-packages/mxnet/ [0x7f655a9510dc]
[bt] (1) /home/dhaehn/D1/lib/python2.7/site-packages/mxnet/ [0x7f655aa43680]
[bt] (2) /home/dhaehn/D1/lib/python2.7/site-packages/mxnet/ [0x7f655aa43b65]
[bt] (3) /home/dhaehn/D1/lib/python2.7/site-packages/mxnet/ [0x7f655b44647d]
[bt] (4) /home/dhaehn/D1/lib/python2.7/site-packages/mxnet/ [0x7f655b42329b]
[bt] (5) /home/dhaehn/D1/lib/python2.7/site-packages/mxnet/ [0x7f655b2f0a1a]
[bt] (6) /lib64/ [0x7f66065e3dcc]
[bt] (7) /lib64/ [0x7f66065e36f5]
[bt] (8) /home/dhaehn/D1/lib64/python2.7/lib-dynload/ [0x7f66067f6c8b]
[bt] (9) /home/dhaehn/D1/lib64/python2.7/lib-dynload/ [0x7f66067f0a85]

# but..
a = mx.nd.array(X_train[50000:80000]) # no problem
I have 200 GB of free memory, so there shouldn't be any issue
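One observation the numbers support (this is an inference, not confirmed in the chat): the two sizes in the MXNetError differ by exactly 2**32, which points to a 32-bit overflow in the shape-size computation rather than a memory problem. The arithmetic can be checked directly:

```python
# Elements in a slice of n samples with shape (n, 6, 119, 119)
def elements(n):
    return n * 6 * 119 * 119

print(elements(50000))          # 4248300000 -> below 2**32, works
print(elements(60000))          # 5097960000 -> exceeds 2**32, fails
print(elements(60000) % 2**32)  # 802992704  -> the second size in the error
print(elements(30000))          # 2548980000 -> X_train[50000:80000], works
```

The 5097960000 vs 802992704 pair in the error is exactly a value and its truncation modulo 2**32, which would explain why slices under ~4.29 billion elements work while the full 80000-sample array fails despite ample RAM.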