Can't write big ndarray to HDF5 using set_attribute method

Created by: acostapazo

When I try to write a single numpy.ndarray above a certain size (e.g. 10000 elements) to an HDF5File, e.g.

import bob, numpy
f = bob.io.HDF5File("t.hdf5", "w")
a = numpy.random.rand(10000)
f.set_attribute("a", a)

I get the following error:

RuntimeError: HDF5File - set_attribute ('t.hdf5'): C++ exception caught: 'call to HDF5 C-function H5Acreate() returned error -1. HDF5 error statck follows:
H5Acreate2() @ ../../../src/H5A.c+256: unable to create attribute
H5A_create() @ ../../../src/H5A.c+505: unable to create attribute in object header
H5O_attr_create() @ ../../../src/H5Oattribute.c+347: unable to create new attribute in header
H5O_msg_append_real() @ ../../../src/H5Omessage.c+224: unable to create new message
H5O_msg_alloc() @ ../../../src/H5Omessage.c+1945: unable to allocate space for message
H5O_alloc() @ ../../../src/H5Oalloc.c+1142: object header message is too large'
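
For what it's worth, the last line of the stack ("object header message is too large") matches what the HDF5 documentation says about attributes: they are stored as messages in the object header, and header messages are limited to 64 KiB. That would explain why 10000 float64 values fail while 1000 succeed. A quick back-of-the-envelope check (the 64 KiB constant is my reading of the HDF5 docs, not something bob exposes):

import numpy

# Assumed limit: HDF5 object header messages max out at 64 KiB,
# and attributes created via H5Acreate() live in the object header.
HEADER_MESSAGE_LIMIT = 64 * 1024

for n in (1000, 10000):
    nbytes = numpy.random.rand(n).nbytes  # float64 = 8 bytes per element
    fits = "fits" if nbytes <= HEADER_MESSAGE_LIMIT else "too large"
    print("%5d elements -> %6d bytes: %s" % (n, nbytes, fits))

This prints 8000 bytes (fits) for 1000 elements and 80000 bytes (too large) for 10000, consistent with the behaviour below.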

If I use the set method instead of set_attribute, it works fine, e.g.

import bob, numpy
f = bob.io.HDF5File("t.hdf5", "w")
a = numpy.random.rand(10000)
f.set("a", a)

Moreover, if I use a smaller numpy array, both methods work without any problem, e.g.

import bob, numpy
f = bob.io.HDF5File("t.hdf5", "w")
a = numpy.random.rand(1000)
f.set("a1", a)
f.set_attribute("a2", a)

The point is that it works well with small arrays, but some problem appears once the attribute grows beyond a certain size.
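
To pin down that threshold, a quick probe along these lines should show where set_attribute starts failing (the sizes here are arbitrary sample points, and I'm assuming the failure depends only on the attribute's byte size):

import bob, numpy

f = bob.io.HDF5File("probe.hdf5", "w")
for n in (1000, 2000, 4000, 8000, 8192, 10000):
    try:
        f.set_attribute("a%d" % n, numpy.random.rand(n))
        print("%5d elements: ok" % n)
    except RuntimeError:
        print("%5d elements: set_attribute fails" % n)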