The cache limit is 10M items, and the run will crash once that limit is exceeded.
For large datasets you can get the following error:
tensorflow.python.framework.errors_impl.InvalidArgumentError: Upstream iterator is producing more than 10000000 items, which is more than the cache limit.
	 [[Node: IteratorGetNext = IteratorGetNext[output_shapes=[[?,4160,1], [?], [?]], output_types=[DT_FLOAT, DT_INT64, DT_STRING], _device="/job:localhost/replica:0/task:0/device:CPU:0"](OneShotIterator)]]
	 [[Node: IteratorGetNext/_101 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device_incarnation=1, tensor_name="edge_20_IteratorGetNext", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:GPU:0"]()]]
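
This error typically points at an in-memory `dataset.cache()` call (no filename argument), which cannot hold more items than the limit in the message. A minimal workaround sketch, assuming the TF 1.x `tf.data` API implied by the trace above; the dataset, its size, and the cache path are placeholders:

```python
import tensorflow as tf

# Placeholder dataset larger than the 10M-item in-memory cache limit.
dataset = tf.data.Dataset.range(20000000)

# In-memory cache (no filename) is the variant that can hit the limit:
# dataset = dataset.cache()

# Workaround: pass a filename so elements are cached on disk instead of
# in memory; the file-backed cache is not subject to the in-memory limit.
dataset = dataset.cache("/tmp/large_dataset_cache")

iterator = dataset.make_one_shot_iterator()
next_element = iterator.get_next()

with tf.Session() as sess:
    for _ in range(3):
        print(sess.run(next_element))  # 0, 1, 2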