Wrongly reporting mean class accuracy #7

Open
Nanne opened this issue Jun 5, 2014 · 2 comments

@Nanne

Nanne commented Jun 5, 2014

In the CNN demos (cifar, housenumbers, and digit-classifier) the performance is reported as the mean class accuracy. However, this is done using confusion.totalValid, which is the total accuracy, not the mean class accuracy. The mean class accuracy can be found in confusion.averageValid.

totalValid is the sum of the diagonal of the confusion matrix divided by the sum of all entries in the matrix (overall accuracy). averageValid is the mean, over classes, of each diagonal entry divided by the sum of its row (the per-class accuracies), i.e. the mean class accuracy.
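
For illustration, a rough Lua sketch of how the two values relate to the raw counts, assuming the counts are stored row-per-target in confusion.mat as in optim.ConfusionMatrix:

local mat = confusion.mat
local correct, total = 0, 0
local perClass = {}
for c = 1, mat:size(1) do
   local rowSum = mat[c]:sum()
   correct = correct + mat[c][c]
   total = total + rowSum
   -- per-class accuracy: correct predictions for class c / examples of class c
   perClass[c] = rowSum > 0 and mat[c][c] / rowSum or 0
end
local totalValid = correct / total                  -- overall accuracy
local averageValid = torch.Tensor(perClass):mean()  -- mean class accuracy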

Is the intention to report the overall accuracy (and the text is wrong), or is the wrong variable being used? My guess would be the latter.

@clementfarabet
Member

That's a bug, the mean class accuracy is definitely in averageValid...

@waldoweng

waldoweng commented Dec 11, 2017

Also, the code needs to call
confusion:updateValids()
before reading confusion.totalValid (or confusion.averageValid), otherwise the reported value may be stale.
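
A minimal sketch of the corrected reporting, assuming the demo builds an optim.ConfusionMatrix from a 'classes' table of label names (hypothetical variable) and fills it during testing:

require 'optim'

local confusion = optim.ConfusionMatrix(classes)
-- ... confusion:add(output, target) for each test example ...

confusion:updateValids()  -- refresh the cached statistics from the raw counts
print(('overall accuracy:    %.2f%%'):format(confusion.totalValid * 100))
print(('mean class accuracy: %.2f%%'):format(confusion.averageValid * 100))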
