Training with 1000 sparse labels #8
I have seen your problem before; it is usually caused by imbalanced classes.
So basically it means that some labels have significantly more training data than others?
Thank you!
Just for understanding: how does this apply to the multi-label solution?
When you have a face, you probably always have either female or male. This would mean you have:
Would this be balanced or imbalanced? Or would you have to add 100 male and 100 female images without a visible face?
The multi-label net actually works with 2^N expanded classes (just made that term up), where N is the number of your classes, i.e. the length of the label vector. In your example with males, females and faces, you want the net to learn to distinguish between the following 8 expanded classes:
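A small sketch of what the expanded classes look like, assuming the three labels from the example (male, female, face); each combination of present/absent labels is one expanded class, including the empty "none" combination:

```python
from itertools import product

# Hypothetical illustration: with N = 3 binary labels, each image falls
# into one of 2^N = 8 "expanded classes" -- one per combination of labels.
labels = ["male", "female", "face"]

expanded_classes = []
for bits in product([0, 1], repeat=len(labels)):
    combo = [name for name, bit in zip(labels, bits) if bit]
    expanded_classes.append(combo if combo else ["none"])

for combo in expanded_classes:
    print(" + ".join(combo))

print(len(expanded_classes))  # 2^3 = 8
```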
You should ideally provide approximately the same number of examples from each of the expanded classes that you want to classify, which is called having a balanced dataset. But that rarely happens, which is one of several reasons why you don't see many multi-label nets around: they are just too hard to train. This project is more of a toy to play with to understand what's going on in multi-label classification; if you want to do something serious, you will definitely need to look under the hood to solve the many problems that will inevitably pop up.
Great explanation, thank you! So I would also need an image class named "none" which contains basically anything that does not include any of the other classes? So it's probably easier to create 3 neural networks, one per class, and to aggregate the results of the positives:
2nd:
3rd:
No, include only positive examples.
I imagine that in this case it would work quite well, yes :)
And this would work? I remember some kind of error message when I tried it a year ago, something like "multiple classes are needed for classification".
I meant to include only images of classes that you want to train.
Of course you need to have more than 1 class; otherwise, what would you be training?
If the data needs 8 expanded classes, why not use 8 single-label classes instead of 3 classes plus "none"?
Sure, you could train a single-label classifier, but then you would not use the added information of actually having just 3 classes of objects. Ideally we want the network to learn to generalize, meaning that it learns to recognize males, females and faces, and even if it never saw an image with all three together, it will correctly output >50% probabilities for all of them on such an image.
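The ">50% for several classes at once" behavior is why a multi-label head scores each class with an independent sigmoid rather than a softmax: sigmoids don't compete, while softmax probabilities must sum to 1. A sketch with hypothetical logits for the three example classes:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical raw scores for (male, female, face) on one image.
logits = [2.0, 1.5, 3.0]

# Per-class sigmoids: every class can exceed 0.5 simultaneously.
print([round(sigmoid(x), 2) for x in logits])

# Softmax: the same scores compete and sum to 1, so at most one
# class can clearly dominate.
print([round(p, 2) for p in softmax(logits)])
```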
@BartyzalRadek Thank you. But when I use your network to do multi-label classification, the net must be trained with images [male + face, female + face, male + female, male + female + face]; otherwise it can't generalize. For example, if I only feed it images that each contain 1 class of object and then give it a test image which has 3 classes of objects, the net can't output >50% probabilities for all of them.
@BartyzalRadek
@leminhlong5194, did you ever solve this problem? I'm attempting to do this with over 1500 possible labels; however, I have a pretty large labeled dataset. I'm facing the issue of identical predictions for each image, so my next step is to set the weights for each label manually, hoping that solves the problem.
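Setting per-label weights manually usually means weighting the positive term of the binary cross-entropy so rare labels cost more when missed. A minimal sketch of that idea (names and the negatives/positives heuristic here are illustrative, not the project's actual code):

```python
import math

def pos_weights(label_counts, num_examples):
    # Common heuristic: weight each label's positive term by
    # (number of negatives) / (number of positives).
    return {lbl: (num_examples - n) / n for lbl, n in label_counts.items()}

def weighted_bce(y_true, y_pred, weights, eps=1e-7):
    # Per-label binary cross-entropy with a positive-class weight.
    loss = 0.0
    for lbl, t in y_true.items():
        p = min(max(y_pred[lbl], eps), 1 - eps)
        w = weights[lbl]
        loss += -(w * t * math.log(p) + (1 - t) * math.log(1 - p))
    return loss / len(y_true)

# Hypothetical counts: one frequent label, one rare label, 1000 images.
counts = {"common": 900, "rare": 10}
w = pos_weights(counts, 1000)
print(w)  # the rare label gets a much larger positive weight
```

With these weights, a missed positive on the rare label contributes far more loss than a missed positive on the common one, pushing the net away from predicting the same frequent labels everywhere.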
I have the same issue; the predictions for the image are wrong.
I was wondering whether you have tried to train the model with 1000 sparse labels?
We have tried to train the model with 750 labels. However, the predictions were not very good. Specifically, it will try to predict True for a constant set of labels and low confidence for the rest. I would be very grateful if you could perhaps share some of your results with us.
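The "constant set of labels predicted True" failure described above can be spotted quickly by checking whether per-label predictions vary across images at all; if the variance collapses to zero, the model has degenerated into a constant output. A sketch with hypothetical prediction arrays:

```python
# Hypothetical per-image predicted probabilities (rows = images,
# columns = labels) from a degenerate model.
preds = [
    [0.9, 0.8, 0.1, 0.1],
    [0.9, 0.8, 0.1, 0.1],
    [0.9, 0.8, 0.1, 0.1],
]

def per_label_variance(preds):
    # Variance of each label's predicted probability across the batch.
    n = len(preds)
    variances = []
    for j in range(len(preds[0])):
        col = [row[j] for row in preds]
        mean = sum(col) / n
        variances.append(sum((x - mean) ** 2 for x in col) / n)
    return variances

# Near-zero variance on every label means the net ignores its input.
degenerate = all(v < 1e-6 for v in per_label_variance(preds))
print(degenerate)
```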