I am becoming fairly comfortable with Keras for deep-learning image classification, but I still need better control over network size and the time required to train and test my networks. One technique I absolutely need to master quickly is fit_generator(), which runs training / testing computations batch by batch instead of holding all the images in memory; 16 GB of RAM disappears pretty fast otherwise. That would let me process much bigger datasets and larger image sizes than the 64×64 pixels I am using now. DenseNet-121 already works with several tricks and 5-fold cross-validation, but ResNet-50 and Inception-v4 do not yet, since both require 224×224 images.
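The batch-by-batch idea can be sketched as a Python generator that loads only one batch of images into RAM at a time and is then handed to fit_generator(). This is a minimal sketch, not my actual pipeline: `batch_generator` and `load_image` are hypothetical names, and the loader is a stand-in that returns blank pixels so the snippet is self-contained (in practice you would read and resize the file with PIL or OpenCV).

```python
import numpy as np

def load_image(path, image_size):
    # Placeholder loader: in a real pipeline, read `path` from disk with
    # PIL / cv2 and resize it to `image_size`. Here we return blank pixels
    # so the sketch runs without any image files.
    return np.zeros(image_size + (3,), dtype=np.float32)

def batch_generator(paths, labels, batch_size=32, image_size=(64, 64)):
    """Yield (images, labels) batches forever, keeping only one batch in RAM."""
    n = len(paths)
    while True:
        # Reshuffle once per pass over the data (one "epoch")
        order = np.random.permutation(n)
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            images = np.stack([load_image(paths[i], image_size) for i in idx])
            targets = np.asarray([labels[i] for i in idx])
            yield images, targets

# Usage with a compiled Keras model (assumed to exist elsewhere):
# model.fit_generator(
#     batch_generator(train_paths, train_labels, batch_size=32),
#     steps_per_epoch=len(train_paths) // 32,
#     epochs=10,
# )
```

Because the generator loops forever, Keras controls epoch boundaries through steps_per_epoch rather than the generator exhausting itself.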