Noisy labels can hinder the training of deep neural networks and degrade accuracy. The proposed $\epsilon$-softmax method modifies the softmax layer's outputs to approximate one-hot vectors with a controllable error $\epsilon$, enhancing noise tolerance. Combined with a symmetric loss function, it balances robustness against effective learning. Extensive experiments demonstrate its effectiveness under both synthetic and real-world label noise.
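One plausible reading of "approximating one-hot vectors with a controllable error" is to mix the softmax output with the one-hot vector of its argmax. The sketch below illustrates that idea only; the function name `eps_softmax` and the mixing coefficient `m` are assumptions, and the paper's exact mapping may differ.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D logit vector.
    e = np.exp(z - z.max())
    return e / e.sum()

def eps_softmax(z, m=0.9):
    # Hypothetical sketch: push the softmax output toward the
    # one-hot vector of its argmax, so the result stays within a
    # controllable distance of one-hot while remaining a valid
    # probability distribution.
    p = softmax(z)
    one_hot = np.zeros_like(p)
    one_hot[np.argmax(p)] = 1.0
    return m * one_hot + (1.0 - m) * p

logits = np.array([2.0, 1.0, 0.1])
q = eps_softmax(logits)
# q sums to 1 and lies close to the one-hot vector [1, 0, 0]
```

Intuitively, a larger mixing weight shrinks the gap to a one-hot target, which is what limits the influence of mislabeled examples on the loss.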
Tags: label-noise, softmax, machine-learning, robustness, deep-learning