SOTAVerified

Second-order methods

Optimization methods that use second-order information, such as the Hessian or curvature estimates, rather than gradients alone.
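As a minimal illustration of the Newton-type update that the papers below build on (a generic sketch, not the method of any particular listed paper), the classic second-order step is x ← x − H(x)⁻¹ g(x), where g is the gradient and H the Hessian:

```python
import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-8, max_iter=50):
    """Minimize a smooth function via Newton's method: x <- x - H(x)^-1 g(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop when the gradient is (near) zero
            break
        # Solve H p = g instead of forming the inverse explicitly
        x = x - np.linalg.solve(hess(x), g)
    return x

# Example: minimize f(x, y) = (x - 1)^2 + 10 * (y + 2)^2
grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])
hess = lambda x: np.array([[2.0, 0.0], [0.0, 20.0]])
print(newton_minimize(grad, hess, [0.0, 0.0]))  # → [ 1. -2.]
```

On a quadratic objective a single Newton step lands exactly on the minimizer; the sub-sampled, stochastic, and quasi-Newton variants in the list below approximate H (or its inverse) to make this update affordable at scale.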

Papers

Showing 101–150 of 181 papers

Title | Status | Hype
Sub-sampled Newton Methods with Non-uniform Sampling | | 0
The Challenges of the Nonlinear Regime for Physics-Informed Neural Networks | | 0
The Many Faces of Exponential Weights in Online Learning | | 0
Unlocking FedNL: Self-Contained Compute-Optimized Implementation | | 0
Utility Maximization for Large-Scale Cell-Free Massive MIMO Downlink | | 0
Explicit Second-Order Min-Max Optimization Methods with Optimal Convergence Guarantee | | 0
Extra-Newton: A First Approach to Noise-Adaptive Accelerated Second-Order Methods | | 0
Faster Differentially Private Convex Optimization via Second-Order Methods | | 0
FedNL: Making Newton-Type Methods Applicable to Federated Learning | | 0
FedOSAA: Improving Federated Learning with One-Step Anderson Acceleration | | 0
Fed-Sophia: A Communication-Efficient Second-Order Federated Learning Algorithm | | 0
First and Second Order Methods for Online Convolutional Dictionary Learning | | 0
GP-FL: Model-Based Hessian Estimation for Second-Order Over-the-Air Federated Learning | | 0
GPU Accelerated Sub-Sampled Newton's Method | | 0
Gradient-Boosted Based Structured and Unstructured Learning | | 0
Gradient Norm Regularization Second-Order Algorithms for Solving Nonconvex-Strongly Concave Minimax Problems | | 0
Hierarchical model-based policy optimization: from actions to action sequences and back | | 0
Highly Efficient Hierarchical Online Nonlinear Regression Using Second Order Methods | | 0
Implementation of a modified Nesterov's Accelerated quasi-Newton Method on Tensorflow | | 0
Improving Stochastic Cubic Newton with Momentum | | 0
Inverse-Free Fast Natural Gradient Descent Method for Deep Learning | | 0
Jorge: Approximate Preconditioning for GPU-efficient Second-order Optimization | | 0
KKT Conditions, First-Order and Second-Order Optimization, and Distributed Optimization: Tutorial and Survey | | 0
Kronecker-Factored Approximate Curvature for Physics-Informed Neural Networks | | 0
Kronecker-factored Quasi-Newton Methods for Deep Learning | | 0
Krylov Cubic Regularized Newton: A Subspace Second-Order Method with Dimension-Free Convergence Rate | | 0
Large Scale Empirical Risk Minimization via Truncated Adaptive Newton Method | | 0
Learning-Augmented Sketches for Hessians | | 0
Meta-descent for Online, Continual Prediction | | 0
Minibatching Offers Improved Generalization Performance for Second Order Optimizers | | 0
Mirror Prox Algorithm for Large-Scale Cell-Free Massive MIMO Uplink Power Control | | 0
Near-Optimal Nonconvex-Strongly-Convex Bilevel Optimization with Fully First-Order Oracles | | 0
Nesterov's Acceleration For Approximate Newton | | 0
Nestrov's Acceleration For Second Order Method | | 0
Newton methods based convolution neural networks using parallel processing | | 0
Newton-Stein Method: An optimization method for GLMs via Stein's Lemma | | 0
Newton-Stein Method: A Second Order Method for GLMs via Stein's Lemma | | 0
On backpropagating Hessians through ODEs | | 0
On the efficiency of Stochastic Quasi-Newton Methods for Deep Learning | | 0
On the importance of initialization and momentum in deep learning | | 0
On The Temporal Domain of Differential Equation Inspired Graph Neural Networks | | 0
Oracle Complexity of Second-Order Methods for Finite-Sum Problems | | 0
Practical Newton-Type Distributed Learning using Gradient Based Approximations | | 0
Exact, Tractable Gauss-Newton Optimization in Deep Reversible Architectures Reveal Poor Generalization | Code | 0
Learning Rates as a Function of Batch Size: A Random Matrix Theory Approach to Neural Network Training | Code | 0
SGD momentum optimizer with step estimation by online parabola model | Code | 0
Fast and Furious Convergence: Stochastic Second Order Methods under Interpolation | Code | 0
Statistical Inference of Constrained Stochastic Optimization via Sketched Sequential Quadratic Programming | Code | 0
SGD with Partial Hessian for Deep Neural Networks Optimization | Code | 0
Sharpened Lazy Incremental Quasi-Newton Method | Code | 0
Page 3 of 4

No leaderboard results yet.