
How To Output Per-class Accuracy In Keras?

Caffe can print not only the overall accuracy but also the per-class accuracy. In the Keras log there is only the overall accuracy, which makes it hard for me to work out the accuracy of each separate class.

Solution 1:

Precision and recall are more useful measures for multi-class classification (see the definitions). Following the Keras MNIST CNN example (10-class classification), you can get the per-class measures using classification_report from sklearn.metrics:

from sklearn.metrics import classification_report
import numpy as np

Y_test = np.argmax(y_test, axis=1)  # convert one-hot labels to class indices
# model.predict_classes() was removed in recent Keras versions;
# taking the argmax over predict() is the equivalent
y_pred = np.argmax(model.predict(x_test), axis=1)
print(classification_report(Y_test, y_pred))

Here is the result:

             precision    recall  f1-score   support

          0       0.99      1.00      1.00       980
          1       0.99      0.99      0.99      1135
          2       1.00      0.99      0.99      1032
          3       0.99      0.99      0.99      1010
          4       0.98      1.00      0.99       982
          5       0.99      0.99      0.99       892
          6       1.00      0.99      0.99       958
          7       0.97      1.00      0.99      1028
          8       0.99      0.99      0.99       974
          9       0.99      0.98      0.99      1009

avg / total       0.99      0.99      0.99     10000
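Note that the per-class recall in the report above is exactly the per-class accuracy: the fraction of samples of each class that were classified correctly. If you only want those numbers, a minimal sketch (with small dummy labels in place of the MNIST model's output) is to take the diagonal of sklearn's confusion matrix and divide by the row sums:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Dummy true labels and predictions for a 3-class problem
y_true = np.array([0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 1, 1, 1, 2, 0])

# Rows of the confusion matrix correspond to true classes, so the
# diagonal holds the correctly classified counts per class and the
# row sums hold the class totals.
cm = confusion_matrix(y_true, y_pred)
per_class_acc = cm.diagonal() / cm.sum(axis=1)
print(per_class_acc)  # [0.5 1.  0.5]
```

With the MNIST example you would pass `Y_test` and `y_pred` from the snippet above instead of the dummy arrays.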

Solution 2:

You are probably looking to use a callback, which you can easily add to the model.fit() call.

For example, you can define your own class using the keras.callbacks.Callback interface. I recommend using the on_epoch_end() function since it will format nicely inside of your training summary if you decide to print with that verbosity setting. Please note that this particular code block is set to use 3 classes, but you can of course change it to your desired number.

import numpy as np
import tensorflow as tf

# your class labels
classes = ["class_1", "class_2", "class_3"]

class AccuracyCallback(tf.keras.callbacks.Callback):

    def __init__(self, test_data):
        super().__init__()
        self.test_data = test_data
        # one accuracy history list per class
        self.class_history = [[] for _ in classes]

    def on_epoch_end(self, epoch, logs=None):
        x_data, y_data = self.test_data

        correct = 0
        incorrect = 0

        x_result = self.model.predict(x_data, verbose=0)

        class_correct = [0] * len(classes)
        class_incorrect = [0] * len(classes)

        for i in range(len(x_data)):
            y = y_data[i]
            res = x_result[i]

            actual_label = np.argmax(y)
            pred_label = np.argmax(res)

            if pred_label == actual_label:
                class_correct[actual_label] += 1
                correct += 1
            else:
                class_incorrect[actual_label] += 1
                incorrect += 1

        print("\tCorrect: %d" % correct)
        print("\tIncorrect: %d" % incorrect)

        for i in range(len(classes)):
            tot = float(class_correct[i] + class_incorrect[i])
            class_acc = -1
            if tot > 0:
                class_acc = float(class_correct[i]) / tot
            self.class_history[i].append(class_acc)
            print("\t%s: %.3f" % (classes[i], class_acc))

        acc = float(correct) / float(correct + incorrect)

        print("\tCurrent Network Accuracy: %.3f" % acc)

Then, you are going to want to pass your new callback to model.fit(). Assuming your validation data (val_data) is a tuple pair, you can use the following:

accuracy_callback = AccuracyCallback(val_data)

# you can use the history if desired
history = model.fit( x=_, y=_, verbose=1, 
           epochs=_, shuffle=_, validation_data = val_data,
           callbacks=[accuracy_callback], batch_size=_
         )

Please note that _ indicates values that depend on your configuration.
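If you would rather avoid the Python loop over samples in the callback above, the same per-class bookkeeping can be done in a vectorized way with np.bincount. This is a sketch, not part of the original answer; `per_class_counts` and the dummy one-hot arrays are illustrative names:

```python
import numpy as np

def per_class_counts(y_onehot, probs, n_classes):
    """Return (correct, total) count arrays indexed by class."""
    actual = np.argmax(y_onehot, axis=1)
    pred = np.argmax(probs, axis=1)
    # total samples per true class
    total = np.bincount(actual, minlength=n_classes)
    # count only the true labels where the prediction matched
    correct = np.bincount(actual[pred == actual], minlength=n_classes)
    return correct, total

# Example: 3 classes, 4 samples
y = np.eye(3)[[0, 1, 2, 2]]   # one-hot true labels
p = np.eye(3)[[0, 1, 2, 0]]   # mock probabilities (argmax = prediction)
correct, total = per_class_counts(y, p, 3)
print(correct / total)  # [1.  1.  0.5]
```

Inside on_epoch_end() you would call this with `y_data` and `self.model.predict(x_data)` and print `correct / total` per class.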

Solution 3:

For per-class training accuracy: run the code below on the training dataset, after (and/or before) training on it.


For raw per-class validation accuracy:

import numpy as np

def per_class_accuracy(y_preds, y_true, class_labels):
    return [np.mean([
        (y_true[pred_idx] == np.round(y_pred)) for pred_idx, y_pred in enumerate(y_preds)
        if y_true[pred_idx] == int(class_label)
    ]) for class_label in class_labels]

def update_val_history():
    [val_history[class_label].append(np.mean(np.asarray(temp_history).T[class_idx])
                             ) for class_idx, class_label in enumerate(class_labels)]

Example:

class_labels = ['0','1','2','3']
val_history = {class_label:[] for class_label in class_labels}

y_true   = np.asarray([0,0,0,0, 1,1,1,1, 2,2,2,2, 3,3,3,3])
y_preds1 = np.asarray([0,3,3,3, 1,1,0,0, 2,2,2,0, 3,3,3,3])  # in practice: model.predict(x1)
y_preds2 = np.asarray([0,0,3,3, 0,1,0,0, 2,2,2,2, 0,0,0,0])  # in practice: model.predict(x2)

temp_history = [per_class_accuracy(y_preds1, y_true, class_labels)]
update_val_history()
temp_history = [per_class_accuracy(y_preds2, y_true, class_labels)]
update_val_history()

print(val_history)

>>{ '0': [0.25, 0.50], '1': [0.50, 0.25], '2': [0.75, 1.00], '3': [1.00, 0.00] }
