2 min read | Saved October 29, 2025
Noisy labels can hinder the training of deep neural networks, degrading their accuracy. The proposed $\epsilon$-softmax method modifies the softmax layer's outputs so that they approximate one-hot vectors with a controllable error, enhancing noise tolerance; combined with a symmetric loss function, it balances robustness against effective learning. Extensive experiments indicate its effectiveness on both synthetic and real-world label noise.
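To make the idea concrete, here is a minimal sketch of one plausible reading of the summary: blend the softmax output toward the one-hot vector of its predicted class, with `eps` controlling how far the result may deviate from an exact one-hot vector. The function name `epsilon_softmax`, the blending formula, and the parameter `eps` are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D logit vector.
    e = np.exp(z - z.max())
    return e / e.sum()

def epsilon_softmax(z, eps=0.1):
    # Hypothetical sketch: interpolate between the one-hot vector of the
    # argmax class and the ordinary softmax output. The L1 distance to the
    # one-hot vector is then bounded by 2 * eps, i.e. a controllable error.
    p = softmax(z)
    one_hot = np.zeros_like(p)
    one_hot[np.argmax(p)] = 1.0
    return (1.0 - eps) * one_hot + eps * p

logits = np.array([2.0, 0.5, -1.0])
q = epsilon_softmax(logits, eps=0.1)
```

With a small `eps`, `q` stays within a bounded distance of a one-hot vector while still carrying some probability mass from the softmax, which matches the summary's description of trading off robustness against effective learning.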