2 links tagged with all of: optimization + deep-learning + neural-networks
Links
This article introduces Delta-Delta Learning (DDL), which enhances standard residual networks by applying a rank-1 transformation to the hidden state matrix. The Delta-Res block update combines the removal of old information with the addition of new data, controlled by a gate. Key components include a reflection direction, a value vector, and a gate parameter.
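The summary above can be sketched concretely. The following is a minimal illustration of a gated rank-1 hidden-state update in the spirit described (erase along a reflection direction, write a value vector, both scaled by a gate); the function name and symbols (`k` for the reflection direction, `v` for the value vector, `beta` for the gate) are assumptions for illustration, not the article's actual API.

```python
import numpy as np

def delta_res_update(H, k, v, beta):
    """Hypothetical sketch of one Delta-Res step: a gated rank-1
    update of the hidden-state matrix H.

    H    : (d, d) hidden-state matrix
    k    : (d,)   reflection direction (normalized inside)
    v    : (d,)   value vector carrying the new information
    beta : scalar gate in [0, 1]; 0 = keep state, 1 = full replace
    """
    k = k / np.linalg.norm(k)           # unit reflection direction
    erase = H @ np.outer(k, k)          # rank-1 piece: old info along k
    write = np.outer(v, k)              # rank-1 write of the new value
    # remove old information and add new data, both controlled by the gate
    return H - beta * erase + beta * write

# Quick check of the sketch: with beta = 1, the state's action on the
# direction k is fully replaced by the value vector v.
rng = np.random.default_rng(0)
d = 4
H = rng.standard_normal((d, d))
k = rng.standard_normal(d)
v = rng.standard_normal(d)
H1 = delta_res_update(H, k, v, beta=1.0)
```

With `beta = 1`, `H1 @ (k / ||k||)` equals `v` exactly, since the projection along `k` is erased and overwritten in the same step; intermediate gate values interpolate between the old and new state.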
Efficient backpropagation (BP) is a fundamental technique in deep learning, first published in its modern form by Seppo Linnainmaa in 1970, building on earlier work by Henry J. Kelley in 1960 and others. Despite these early origins, BP met decades of skepticism before being accepted as a practical training method for deep neural networks. The article traces this history and addresses common misconceptions about who invented BP and how it came to be applied to neural networks.