@mounika84 wrote:
I am using a feed-forward neural network for a classification task. My data is 1 million examples across 9 classes (imbalanced). Due to memory constraints in Keras, I used a generator function to produce batches automatically with a batch size of 200. I trained a simple model with 3 hidden layers and ReLU activations; the input layer takes 39-dimensional MFCCs and the output is the 9 classes. This model worked fine when I used a subset of this huge dataset, but now, when training from the generator via model.fit_generator, the training accuracy just wanders around and the validation accuracy is far too low. It looks like the model is not learning at all. What might be the possible reasons for this behaviour?
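One common culprit with a setup like this is the generator itself: if the augmented examples are yielded in disk order, the thousands of variants generated from each clean utterance sit next to each other, so every batch of 200 is nearly single-class and the loss never settles. Below is a minimal sketch of a batch generator that reshuffles every epoch; the function name, toy data, and shapes (39-dim inputs, 9 classes) are illustrative stand-ins, not the poster's actual code.

```python
import numpy as np

def batch_generator(features, labels, batch_size=200, n_classes=9, seed=0):
    """Yield shuffled (x, y_one_hot) batches forever, reshuffling each epoch.

    Shuffling matters here: augmented copies of the same clean utterance
    tend to be stored contiguously, so unshuffled batches can be nearly
    single-class and training accuracy just wanders.
    """
    rng = np.random.default_rng(seed)
    n = len(features)
    while True:
        order = rng.permutation(n)  # new shuffle every epoch
        for start in range(0, n - batch_size + 1, batch_size):
            idx = order[start:start + batch_size]
            x = features[idx]
            y = np.eye(n_classes)[labels[idx]]  # one-hot targets
            yield x, y

# Toy data shaped like the post: 39-dim MFCC vectors, 9 classes.
X = np.random.randn(1000, 39).astype("float32")
y = np.random.randint(0, 9, size=1000)
gen = batch_generator(X, y, batch_size=200)
xb, yb = next(gen)
print(xb.shape, yb.shape)  # (200, 39) (200, 9)
```

A generator like this can be passed straight to model.fit_generator (or model.fit in newer Keras) with steps_per_epoch set to n // batch_size.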
About the data:
Data: speech
The subset of the huge data mentioned above was totally clean, but I generated the 1 million examples from only 1300 examples using data augmentation techniques such as equalization, time stretch, time compression, added noise, reverb, etc.
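Since the 9 classes are imbalanced, another thing worth checking is whether the loss is dominated by the majority classes. A standard remedy is inverse-frequency class weights passed to Keras via the class_weight argument; the sketch below uses hypothetical random labels in place of the real ones.

```python
import numpy as np

# Hypothetical label array standing in for the real 9-class labels.
rng = np.random.default_rng(0)
labels = rng.integers(0, 9, size=10000)

counts = np.bincount(labels, minlength=9)
# "Balanced" inverse-frequency weights: n_samples / (n_classes * count_c).
# Rarer classes get proportionally larger weights.
weights = counts.sum() / (len(counts) * counts)

# Keras expects a {class_index: weight} dict for the class_weight argument
# of fit()/fit_generator().
class_weight = {c: float(w) for c, w in enumerate(weights)}
print(class_weight)
```

Note that class weights rescale the loss but do not fix a non-shuffling generator; both issues need to be ruled out separately.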
Posts: 6
Participants: 3