Covid-19 and Pneumonia Classification

I just finished this project. Hopefully I didn't make any big mistakes.
The accuracy may not be very high… should I try to improve the model, or move on to the next challenge?

Link to GitHub

Link to Colaboratory

I didn't know about the iterator's image_shape property, so useful! I was having some tensor shape-mismatch errors because I was hardcoding the shape :confused: One doubt I've still got: why do models always perform better when we rescale by 1./255? Isn't the rescale supposed to be based on the actual image size? I think I skimmed through the part where CC explains why.
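For what it's worth, here's a minimal sketch (using a made-up grayscale pixel array) of what rescale=1./255 actually does: it isn't tied to image size at all, it just multiplies every pixel value so that 0–255 intensities end up in the 0–1 range.

import numpy as np

# Hypothetical row of grayscale pixel intensities (not from the project data)
pixels = np.array([[0, 64, 128, 255]], dtype=np.float32)

# rescale=1./255 in ImageDataGenerator multiplies each pixel by this factor
print(pixels * (1.0 / 255))  # -> [[0.  0.2509804  0.5019608  1.]]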

By the way, how did you perform?
I’ve got close to 0.8 accuracy and 0.4 loss. I’ll try to tweak things a bit more.

I'm still at the stage where I'm trying to understand how tuning hyperparameters helps the model perform better, but I've figured out that the sample size for this project is pretty small. Since you are doing so well, I would definitely move on, but find something on Kaggle with a bigger dataset! I'm guessing they provided a small dataset so it wouldn't break their server.

Anyway, here’s , my code:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.callbacks import EarlyStopping
from tensorflow.keras import layers

import matplotlib.pyplot as plt
import app

# Data augmentation plus pixel rescaling for the X-ray images
generator = ImageDataGenerator(rotation_range=15, width_shift_range=0.2, height_shift_range=0.2, zoom_range=0.2, rescale=1./255)
batch = 10

# Grayscale iterators with sparse (integer) labels for the three classes
train_iterator = generator.flow_from_directory('augmented-data/train', batch_size=batch, color_mode='grayscale', class_mode='sparse')
val_iterator = generator.flow_from_directory('augmented-data/test', batch_size=batch, color_mode='grayscale', class_mode='sparse')

# Stop training once accuracy stops improving for 3 epochs
callback = EarlyStopping(monitor='accuracy', patience=3)

# Small CNN: two Conv2D + MaxPooling blocks, then a dense classifier head
model = Sequential()
model.add(layers.Input(shape=train_iterator.image_shape))
model.add(layers.Conv2D(4, 3, activation='relu'))
model.add(layers.MaxPooling2D(pool_size=(3, 3), strides=3))
model.add(layers.Conv2D(4, 3, activation='relu'))
model.add(layers.MaxPooling2D(pool_size=(3, 3), strides=3))
model.add(layers.Flatten())
model.add(layers.Dense(16, activation='relu'))
model.add(layers.Dense(3, activation='softmax'))  # 3 output classes

# Sparse categorical cross-entropy matches the class_mode='sparse' labels
model.compile(loss='sparse_categorical_crossentropy', optimizer=keras.optimizers.Adam(learning_rate=0.01), metrics=['accuracy'])
model.fit(train_iterator, epochs=1000, steps_per_epoch=100, validation_data=val_iterator, validation_steps=100, callbacks=[callback])
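If you want to sanity-check the model beyond plain accuracy, here's a minimal sketch (assuming the iterators above and that scikit-learn is available, neither of which is in the original post) of printing a per-class classification report on the validation set:

from sklearn.metrics import classification_report
import numpy as np

# Collect predictions over one full sweep of the validation iterator
y_true, y_pred = [], []
for _ in range(len(val_iterator)):
    x_batch, y_batch = next(val_iterator)
    probs = model.predict(x_batch, verbose=0)
    y_true.extend(y_batch.astype(int))
    y_pred.extend(np.argmax(probs, axis=1))

print(classification_report(y_true, y_pred))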

I got super high accuracy!
I used your code to troubleshoot mine and changed some values; it works great!
Highest accuracy is 0.8594, here's the code:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.callbacks import EarlyStopping
from tensorflow.keras import layers

# Lighter augmentation than above, same 1./255 rescaling
data_generator = ImageDataGenerator(rescale=1.0/255, zoom_range=0.1, rotation_range=25, width_shift_range=0.05, height_shift_range=0.05)
batch = 4

training_iterator = data_generator.flow_from_directory('dataset/train', batch_size=batch, color_mode='grayscale', class_mode='sparse')
validation_iterator = data_generator.flow_from_directory('dataset/test', batch_size=batch, color_mode='grayscale', class_mode='sparse')

# Same small CNN architecture as the post above
model = Sequential()
model.add(layers.Input(shape=training_iterator.image_shape))
model.add(layers.Conv2D(4, 3, activation='relu'))
model.add(layers.MaxPooling2D(pool_size=(3, 3), strides=3))
model.add(layers.Conv2D(4, 3, activation='relu'))
model.add(layers.MaxPooling2D(pool_size=(3, 3), strides=3))
model.add(layers.Flatten())
model.add(layers.Dense(16, activation='relu'))
model.add(layers.Dense(3, activation='softmax'))

model.compile(loss='sparse_categorical_crossentropy', optimizer=keras.optimizers.Adam(learning_rate=0.01), metrics=['accuracy'])
print(model.summary())

# Stop once training accuracy plateaus for 3 epochs
es = EarlyStopping(monitor='accuracy', patience=3)
model.fit(training_iterator, steps_per_epoch=16, epochs=64, validation_data=validation_iterator, validation_steps=16, callbacks=[es])


Also, after deleting steps_per_epoch and validation_steps I got high accuracy most of the time.
The model keeps improving on its own, and at the 13th epoch I got 0.9004!
At the beginning the accuracy was low (around 0.4), but it rose to the peak shown above.
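For anyone trying the same thing, here's a minimal sketch (reusing the model, iterators, and es callback from the code above) of dropping those arguments and plotting the accuracy curve. When steps_per_epoch is omitted, Keras runs through every batch the iterator yields in each epoch:

import matplotlib.pyplot as plt

# Without steps_per_epoch / validation_steps, each epoch covers the full iterators
history = model.fit(training_iterator, epochs=64, validation_data=validation_iterator, callbacks=[es])

# Plot training vs validation accuracy per epoch
plt.plot(history.history['accuracy'], label='train accuracy')
plt.plot(history.history['val_accuracy'], label='val accuracy')
plt.xlabel('epoch')
plt.ylabel('accuracy')
plt.legend()
plt.show()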