I’m almost through the first project in the deep learning skill path - here’s a link to it: Deep learning regression - Deepnote
What I still don’t understand is why, when I try to run a random or grid search over the hyperparameter space (i.e., fitting GridSearchCV or RandomizedSearchCV to the training features and labels), I get the error “NotFittedError: All estimators failed to fit”, with no hint about what actually went wrong. Has anyone else run into this?
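For anyone debugging the same thing: I’ve since found that passing `error_score="raise"` to the search makes it re-raise the underlying exception instead of swallowing it into “All estimators failed to fit”. A minimal sketch of the kind of setup I mean, using scikit-learn’s `MLPRegressor` as a stand-in for my actual Keras model (the parameter grid here is just illustrative):

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPRegressor

# toy regression data standing in for the project dataset
X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=0)

param_grid = {
    "hidden_layer_sizes": [(16,), (32,)],
    "alpha": [1e-4, 1e-3],
}

search = GridSearchCV(
    MLPRegressor(max_iter=500, random_state=0),
    param_grid,
    cv=3,
    # re-raise the real error from a failing fit instead of the opaque
    # "NotFittedError: All estimators failed to fit"
    error_score="raise",
)
search.fit(X, y)
print(search.best_params_)
```

With Keras models the same idea applies once the model is wrapped in a scikit-learn-compatible estimator; the real exception (often a shape mismatch or a bad hyperparameter name) then shows up directly in the traceback.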
Also, using dropout appears to significantly worsen model performance in this case. What is the intuition for why dropout helps in some cases but not in others?
More generally, I was wondering how people efficiently log the results of different hyperparameter, performance-metric, and model-architecture choices in deep learning. Is there a particular tool ML engineers use to track their experiments?
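For context, right now I just append each run to a CSV by hand, roughly like this (a minimal sketch; `log_run` and the column schema are entirely my own ad-hoc choices, not from any library):

```python
import csv
import os

def log_run(path, params, metrics):
    """Append one experiment run (hyperparameters + metrics) as a CSV row."""
    row = {**params, **metrics}
    file_exists = os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(row.keys()))
        if not file_exists:
            writer.writeheader()  # write the column names once
        writer.writerow(row)

# example: record one run's hyperparameters and its validation score
log_run("runs.csv", {"lr": 1e-3, "dropout": 0.2}, {"val_mse": 0.41})
```

It works, but it gets unwieldy once the set of hyperparameters changes between runs, which is why I’m curious what people use in practice.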
Of course, any general feedback will be appreciated too!