
TensorFlow-IO Dataset Input Pipeline with Very Large HDF5 Files

I have very large training files (30 GB). Since the full dataset does not fit in my available RAM, I want to read the data in batches. I saw that there is a TensorFlow-IO package that implements this kind of HDF5 input pipeline.
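The core pattern being asked about is reading slices of a large on-disk dataset one batch at a time instead of loading everything into RAM. Here is a framework-agnostic sketch of that batching loop using NumPy only; the in-memory array stands in for an on-disk HDF5 dataset (with h5py you would slice `f["data"][start:stop]` the same way, and each slice would trigger a partial read from disk). The shapes and batch size are illustrative assumptions, not from the original post.

```python
import numpy as np

def batched_reader(dataset, batch_size):
    """Yield successive slices of `dataset` without loading it all at once.

    `dataset` can be any sliceable object; with h5py you would pass
    f["data"], and each slice reads only that chunk from disk.
    """
    n = len(dataset)
    for start in range(0, n, batch_size):
        yield dataset[start:start + batch_size]

# Stand-in for a large on-disk dataset (assumed shape: (N, features)).
data = np.arange(1000, dtype=np.float32).reshape(100, 10)

batches = list(batched_reader(data, batch_size=32))
print(len(batches))          # 4 batches: 32 + 32 + 32 + 4 rows
print(batches[-1].shape[0])  # last, partial batch has 4 rows
```

The same generator can be wrapped in `tf.data.Dataset.from_generator` to feed a Keras model, so memory usage stays bounded by the batch size rather than the file size.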

Solution 1:

Providing the solution here in the answer section, even though it is already present in the comment section, for the benefit of the community.

There was no issue with the code; the problem was with the data, which had not been preprocessed properly. As a result, the model was unable to learn well, which led to the strange loss and accuracy values.
