Hi everybody,
I recently completed the Build Deep Learning Models with TensorFlow skill path.
This portfolio project helped me understand deep learning concepts even better. I took care to apply everything I learned in this skill path on Codecademy, following the deep learning workflow. However, I had to skip hyperparameter tuning because the dataset was too large and my computer's resources weren't enough. Despite this, I achieved 90% accuracy on the test dataset just by doing the data cleaning and preprocessing steps carefully. If I get a more powerful computer in the future, I'd like to return to this project, do the hyperparameter tuning, and push the accuracy even higher. I would appreciate it if you could give me feedback. Thanks in advance.
https://github.com/meleknurb/DL-Portfolio-Project
"Despite computational limitations preventing exhaustive hyperparameter tuning, this model provides a strong foundation for forest cover type classification. With additional fine-tuning and optimization, it has the potential to be deployed in real-world applications for forest management and ecological studies. Future iterations can focus on reducing overfitting while maintaining high accuracy.
Next Steps: If computational resources allow, I can revisit this project to refine hyperparameters and explore more advanced techniques. However, even in its current state, this model effectively differentiates between forest cover types with remarkable accuracy."
—> You can still use the free Google Colab CPU and set up checkpoints for the hyperparameter phase. A small random search will do, and you can leave the notebook running for hours, saving each iteration to Drive, for example, so you can always carry on where you left off (when Google disconnects your environment…). Anyway, good job on your model!
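A minimal sketch of the resumable random search described above, using only the standard library. All names here are hypothetical: `evaluate` is a placeholder for actually building and training the Keras model, and on Colab `RESULTS_FILE` would point at a mounted Drive path (e.g. `/content/drive/MyDrive/trials.json`) so completed trials survive a disconnect.

```python
import json
import random
from pathlib import Path

# Hypothetical checkpoint file; on Colab, point this at a mounted Drive path.
RESULTS_FILE = Path("trials.json")

# A small, hypothetical search space for the sketch.
SEARCH_SPACE = {
    "learning_rate": [1e-2, 1e-3, 1e-4],
    "hidden_units": [64, 128, 256],
    "batch_size": [32, 64],
}

def sample_config(rng):
    # Draw one random value per hyperparameter.
    return {name: rng.choice(choices) for name, choices in SEARCH_SPACE.items()}

def load_trials():
    # Resume from previously saved trials if the file exists
    # (e.g. after a Colab disconnect); otherwise start fresh.
    if RESULTS_FILE.exists():
        return json.loads(RESULTS_FILE.read_text())
    return []

def save_trials(trials):
    RESULTS_FILE.write_text(json.dumps(trials, indent=2))

def evaluate(config):
    # Placeholder: in the real project this would train the Keras model
    # with `config` and return its validation accuracy.
    rng = random.Random(str(sorted(config.items())))
    return round(rng.uniform(0.80, 0.95), 4)

def random_search(n_trials=10, seed=0):
    rng = random.Random(seed)
    trials = load_trials()
    while len(trials) < n_trials:
        config = sample_config(rng)
        trials.append({"config": config, "val_accuracy": evaluate(config)})
        save_trials(trials)  # persist after every trial, not just at the end
    return max(trials, key=lambda t: t["val_accuracy"])

best = random_search()
print("best config:", best["config"], "val_accuracy:", best["val_accuracy"])
```

Saving after every trial is the key design choice: if the runtime is recycled mid-search, rerunning the notebook picks up at the first missing trial instead of starting over.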
Thank you for your valuable comment. I will definitely try Google Colab.