
Distributed Optimization

The goal of distributed optimization is to optimize an objective defined over millions or billions of data points distributed across many machines, by exploiting the combined computational power of those machines.

Source: Analysis of Distributed Stochastic Dual Coordinate Ascent
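A minimal sketch of the data-parallel setting this definition describes, with workers simulated in a single process: each worker holds a shard of the data and computes the gradient on its shard, and a coordinator averages those gradients before taking a step. The function name and parameters here are illustrative, not from any specific paper on this page.

```python
import numpy as np

def distributed_gd(X, y, n_workers=4, lr=0.1, steps=200):
    """Synchronous distributed gradient descent for least squares.

    Each simulated worker computes the gradient of the loss on its
    own data shard; the coordinator sums the shard gradients into
    the full-data gradient and applies one update per round.
    """
    w = np.zeros(X.shape[1])
    # Partition the data across the simulated workers.
    shards = list(zip(np.array_split(X, n_workers),
                      np.array_split(y, n_workers)))
    n = len(y)
    for _ in range(steps):
        # Workers compute local gradients in parallel (simulated here).
        grad = sum(Xi.T @ (Xi @ w - yi) for Xi, yi in shards) / n
        w -= lr * grad  # coordinator applies the aggregated update
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
w_hat = distributed_gd(X, y)
```

Because the shard gradients are summed exactly, each round is mathematically identical to full-batch gradient descent on the pooled data; the communication-efficient methods listed below trade away this exactness (local steps, compression, gossip) to reduce how often and how much the workers must exchange.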

Papers

Showing 276-300 of 536 papers

Title | Status | Hype
Collaborative Learning over Wireless Networks: An Introductory Overview | - | 0
Variance Reduction in Deep Learning: More Momentum is All You Need | - | 0
FLIX: A Simple and Communication-Efficient Alternative to Local Methods in Federated Learning | - | 0
A Semi-Distributed Interior Point Algorithm for Optimal Coordination of Automated Vehicles at Intersections | - | 0
Finite-Time Consensus Learning for Decentralized Optimization with Nonlinear Gossiping | - | 0
Basis Matters: Better Communication-Efficient Second Order Methods for Federated Learning | - | 0
Decentralized Feature-Distributed Optimization for Generalized Linear Models | - | 0
Cell Zooming with Masked Data for Off-Grid Small Cell Networks: Distributed Optimization Approach | - | 0
Parallel Feedforward Compensation for Output Synchronization: Fully Distributed Control and Indefinite Laplacian | - | 0
Acceleration in Distributed Optimization under Similarity | - | 0
A Reinforcement Learning Approach to Parameter Selection for Distributed Optimal Power Flow | - | 0
Utilizing Redundancy in Cost Functions for Resilience in Distributed Optimization and Learning | - | 0
Distributed Optimization of Graph Convolutional Network using Subgraph Variance | - | 0
Distributed Privacy-Preserving Electric Vehicle Charging Control Based on Secret Sharing | - | 0
KKT Conditions, First-Order and Second-Order Optimization, and Distributed Optimization: Tutorial and Survey | - | 0
Distributed Optimization using Heterogeneous Compute Systems | Code | 0
Communication-Efficient Federated Linear and Deep Generalized Canonical Correlation Analysis | Code | 0
Distributed Online Optimization with Byzantine Adversarial Agents | - | 0
Toward Communication Efficient Adaptive Gradient Method | - | 0
Asynchronous Iterations in Optimization: New Sequence Results and Sharper Algorithmic Guarantees | - | 0
On the Convergence of Decentralized Adaptive Gradient Methods | - | 0
The Minimax Complexity of Distributed Optimization | - | 0
Private Multi-Task Learning: Formulation and Applications to Federated Learning | Code | 0
FL-MISR: Fast Large-Scale Multi-Image Super-Resolution for Computed Tomography Based on Multi-GPU Acceleration | - | 0
Dynamic communication topologies for distributed heuristics in energy system optimization algorithms | Code | 0
Page 12 of 22

No leaderboard results yet.