
Keras Multiclass Classification Probabilities Do Not Sum Up To 1

When using the following Keras network to train and classify 9 classes, the predicted class probabilities do not sum to 1: from keras.models import Model from keras.layers import Convolution1D, Input, Dropout, GlobalMaxPooling1D, D…
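The question text is truncated, but the symptom it describes can be reproduced with a minimal sketch of that kind of setup. The input shape, layer sizes, and variable names below are assumptions, not the asker's actual code; the key point is the sigmoid output, which squashes each of the 9 units independently, so the per-class scores are not constrained to sum to 1.

from keras.models import Model
from keras.layers import Conv1D, Input, Dropout, GlobalMaxPooling1D, Dense

num_categories = 9

inputs = Input(shape=(100, 1), name='main_input')        # assumed input shape
x = Conv1D(64, 3, activation='relu')(inputs)
x = GlobalMaxPooling1D()(x)
drop_dense = Dropout(0.5)(Dense(64, activation='relu')(x))
main_output = Dense(num_categories, activation='sigmoid',  # independent sigmoids
                    name='main_output')(drop_dense)

model = Model(inputs=inputs, outputs=main_output)
model.compile(optimizer='adam', loss='mse')               # mse, as in the question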

Solution 1:

Based on the array that you posted, you have 9 categories. In that case, you should replace the final activation function with softmax instead of sigmoid. In addition, if you haven't done so already, you need to transform your labels into one-hot vectors; you can do that with the to_categorical function. Finally, as a loss function you should use categorical_crossentropy instead of mse. A tutorial on using Keras for classification (using the functions mentioned above) is provided here.
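A short sketch of the three changes this answer recommends, applied to the hypothetical model above (inputs, drop_dense, and the layer sizes are assumptions carried over from that sketch, not from the question):

import numpy as np
from keras.utils import to_categorical
from keras.models import Model
from keras.layers import Dense

# 1. Replace the sigmoid output with softmax and rebuild the model.
main_output = Dense(9, activation='softmax', name='main_output')(drop_dense)
model = Model(inputs=inputs, outputs=main_output)

# 2. One-hot encode the integer labels 0..8.
y_int = np.array([0, 3, 8, 1])                   # example integer labels
y_onehot = to_categorical(y_int, num_classes=9)  # shape (4, 9)

# 3. Use categorical cross-entropy instead of mse.
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])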

Solution 2:

In general, when you want the output to behave like a categorical probability distribution, you use a softmax activation function in the last layer instead of a sigmoid:

main_output = Dense(num_categories, activation='softmax', name='main_output')(drop_dense)
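With a softmax output layer, each row returned by model.predict is a probability distribution over the num_categories classes and sums to 1 up to floating-point error. The check below assumes the hypothetical rebuilt model and input shape from the sketches above:

import numpy as np

probs = model.predict(np.random.rand(5, 100, 1))   # assumed input shape (100, 1)
print(probs.sum(axis=1))                           # ~[1. 1. 1. 1. 1.]
print(probs.argmax(axis=1))                        # predicted class per sample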
