
L2 Regularization

See Weight Decay.

$L_{2}$ Regularization, or Weight Decay, is a regularization technique applied to the weights of a neural network. We minimize a loss function comprising both the primary loss and a penalty on the (squared) $L_{2}$ norm of the weights:

$$L_{new}\left(w\right) = L_{original}\left(w\right) + \lambda{w^{T}w}$$

where $\lambda$ is a hyperparameter determining the strength of the penalty; larger values of $\lambda$ push the optimizer toward smaller weights.
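Taking the gradient of the penalized loss shows where the name "weight decay" comes from: under plain gradient descent with learning rate $\eta$ (a minimal derivation from the objective above, using $\nabla_{w}\,\lambda w^{T}w = 2\lambda w$), each update shrinks the weights multiplicatively before applying the original gradient:

$$w \leftarrow w - \eta\nabla L_{new}\left(w\right) = \left(1 - 2\eta\lambda\right)w - \eta\nabla L_{original}\left(w\right)$$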

Weight decay can also be incorporated directly into the weight update rule, rather than implicitly through the objective function. In practice, the term weight decay usually refers to the implementation that shrinks the weights directly in the update rule, whereas $L_{2}$ regularization usually refers to the implementation that adds the penalty term to the objective function; the sketch below contrasts the two.
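A minimal NumPy sketch of one SGD step under each formulation (the function names and the values of `lr` and `lam` are illustrative assumptions, not a reference implementation):

```python
import numpy as np

def sgd_step_l2_loss(w, grad_original, lr=0.1, lam=1e-4):
    """L2 regularization: add the gradient of the penalty term
    lam * w^T w (i.e. 2 * lam * w) to the original loss gradient."""
    grad = grad_original + 2.0 * lam * w
    return w - lr * grad

def sgd_step_weight_decay(w, grad_original, lr=0.1, lam=1e-4):
    """Weight decay: shrink the weights directly in the update rule,
    separately from the step on the original gradient."""
    return w - lr * grad_original - lr * lam * w

# Example: both rules applied to the same weights and gradient.
w = np.array([1.0, -2.0, 0.5])
g = np.array([0.1, 0.0, -0.3])
print(sgd_step_l2_loss(w, g))
print(sgd_step_weight_decay(w, g))
```

For plain SGD the two updates coincide up to a constant rescaling of $\lambda$; the distinction becomes material when the gradient is further transformed, e.g. by an adaptive optimizer, where only the second form keeps the shrinkage separate from the gradient step.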

Papers

Showing 26–50 of 128 papers

Correlated Initialization for Correlated Data
CtrTab: Tabular Data Synthesis with High-Dimensional and Limited Data
Customers Churn Prediction in Financial Institution Using Artificial Neural Network
An FPGA-Based On-Device Reinforcement Learning Approach using Online Sequential Learning
Data-dependent Gaussian Prior Objective for Language Generation
Analysis of High-dimensional Gaussian Labeled-unlabeled Mixture Model via Message-passing Algorithm
A Bayesian encourages dropout
Feature Representation for ICU Mortality
Automatic Discovery and Optimization of Parts for Image Classification
Derivative-based regularization for regression
Deep Optimization model for Screen Content Image Quality Assessment using Neural Networks
Attentive Recurrent Tensor Model for Community Question Answering
Distribution-Dependent Sample Complexity of Large Margin Learning
Automatic Parameter Tying in Neural Networks
Dropout Regularization Versus $\ell_2$-Penalization in the Linear Model
Dynamic Long Short-Term Memory Neural-Network-Based Indirect Remaining-Useful-Life Prognosis for Satellite Lithium-ion Battery
Edge-adaptive l2 regularization image reconstruction from non-uniform Fourier data
Effectiveness of L2 Regularization in Privacy-Preserving Machine Learning
Effect of the regularization hyperparameter on deep learning-based segmentation in LGE-MRI
Electromyography Signal Classification Using Deep Learning
Emergence of Implicit Filter Sparsity in Convolutional Neural Networks
Emphasizing Unseen Words: New Vocabulary Acquisition for End-to-End Speech Recognition
Empirical Study on Airline Delay Analysis and Prediction
Exponentially Weighted $\ell_2$ Regularization Strategy in Constructing Reinforced Second-order Fuzzy Rule-based Model
A MAX-AFFINE SPLINE PERSPECTIVE OF RECURRENT NEURAL NETWORKS
